[Binary tar archive — contents not renderable as text]

Archive members (from ustar headers):
- var/home/core/zuul-output/          (directory, mode 0755, owner core:core)
- var/home/core/zuul-output/logs/     (directory, mode 0755, owner core:core)
- var/home/core/zuul-output/logs/kubelet.log.gz  (gzip-compressed kubelet log, mode 0644, owner core:core)

The payload of kubelet.log.gz is compressed binary data; the original log text is not recoverable from this capture.
T)LޘumZxsQ)@hTj]PQD"92Ux,^q;pjz*մV0p 6 ߳Hhr*a#"h*h]Fw8$$Y?dC;?kわ BȬ3{B$K0Ά_˄!QI1>0kܧ)1Cqm\iٿ?G].|I޻{*"y߳c2c@rN!_ )h*d{l"\q1!xu:&*aP& 0tH0J:%‚y]f<0'1OW痡g&SͮM)HU=olzCN5Ptح 1KmQ|vqAr0{6Nu܌][lx}3g#Sb̷ {R}3"j_oer7f0]~7T|p&i8n> L4pп,tq7g7^r4/#AgdӬMʈVl:k„G_#)@~,yO1ΆtTR{ͷSptvrTē+NK(m#ptU=/]kWSYboP5gw} L7?9_1Qoנ 8B&[po꿽u)mkjo25Uleo]/2+7ywK|)B`̀@_tS[]IKR3|%gS|҄ށk!maxi|^;Cck=Va?8ӌSKUin.7p WIWp7$bkISnQ.u9;zOޚM_pt3wE!ǧ UQ؎ûVR;[Q)gL Y@E9%b)0Y/H)aN(Esfͭ$G~Ze^ƛ<r 8ȹM~?E@Z)"V_"D4[hk|/]8X;=٥dQBR~s_`"F[sYvR\mMv~He0~b7\ߐA(*밤 at3ErR ]a qHh:'$#C̫>4Tןn^ GK)sɉALu>^v𲃗 / HzuDQh1`7k,Oh l)f7LG eZ3_IDk1hFs+%ex9k̫dK iHXUG7tvGXLksk zUO,2e9f]1tzx=.ǴA㒮7@RҊ\iI@huO82ோ 5{SIU;z/sPtguuY8)^]Fz:agz.pxy=?rޝW]10(l"]y%m[sml;Io׶ϟ>C2g-6ߓm.jǶhدT &;8չZ1%H&XTw3WF370kfag6p6&iRCay1YX|^T¦ydlnlK%3{d`;lhXDžG£@TN9p\}}-\ަRD{cE0u!zdJ%xB:yPZ-aL <* s~]`FʩR ˽)vڂY͢bTƭ%;ZtBF Sڞ>g٪T~՚JF#R(#Bp).}6x#32x5sf"׆)5kHd|c 钫5m$-/~!7kW;Pŵ)PQk %I.P!`V0/2GJGt`ZDОIՆ X0;9ia iU3ؠL1hL> 'H е!OFz@&Ҙ68C㱎^i@ QzHLdkmH֯B& d&9Ak}H؋}S=8$-ER*-pI%UMB ȝ"6t!ŌKM;j ہRXkv #X)T+F`@c*rP51!nuD T*2u,P͡B@s͕dLc`јB.b *T*+řU!X1W\}4`* ?KsJ rDјBb *TZ+Íd`B!we%D*TR+KoʫgBws?4\uˉpr{YjxkPUa\];BMBĒrFqbTwfh;Iܫ(dVJx9؟C8_jz \_].T\ %u2lm6lB ̑6 6)`U$) LU&`=Ub dnSBI@H ޵%NL K "oBXH1Q+<)+YR2:Q}@ے ,PE})*l " 儤#ށeT'`$؊lg&x6ad ;T1oA, ,EW`N!KeYw&lH !m^ N&!re1Yd\啐Ȥ9eMk9(tFM3kf > X4.l $ׂ h +%%惯Y= ^ G_GWM1.x.GRaEmLnU9(~{vq6[)gpqG޶ ˎpP+P)YVB Kv!f!6ؐR`C lH )!6@p .J=sҪeY׋=6ZVh?R~@CFr5 5 [n` l-7`*=!vvn?vyX pj4TIBQ]zm߇]}!'Z*ÜL %[McYJXjso?ߢCp;vevŷE&ЋAKQs eNɵ.V7G}߰EXh͇ru$oσM5\>Bzf5瓍_ԚSs蝃+RtRi>T3(FhmMЁzԢLRRƹt["Rb* K 5Ƅ̼iNAxfBmB- ڑPY9j܊##KO×|GGoOf f[^Nc&U_>̿[p5Y+D2*J+'d1YбsWtN F##E$eqTF` Dِ"3z#_ljzZ0Ybn?whvv Х08JdVT*SU \e`+g K0чCMI0\eu"z ƎM$:iƴ2 Ls©D&"eNe傁a:3Ѥ@&9IU2H`4J1}P왜 Bh>]\Jd;6:H/R= %O8ggRoxy[ a,o },NY^ Q)z |ƍd`E>9]##0mE oG-۱4LkR +0EM"(+t`u#\Y(!gv //f-õ[Xj0vzeQW+?ʯuۨ5w\ } t^cg ]QuAIq:=4v>n3R`ɂ?/ūWTOM貜_o# 0\.VP ie5RG#d藒ҷs3@HǢt=fq9b~4fgF=(c@,9GBfTZ,iY[U L4BEY* єjЈ@hj} Z} <]]7eVM8gV<[%D*i%`K2$Uf+lTG}\ki ZCb:͜)QVDcdԩ + `D \EEjW+;+ä|_5g?3.(Tq?o8N0A4Wv#JF<]w\Uitɓw}(ʅ˒u5~ezNueпX=D,%]?*,qB 
%N%vrENεpFt{OF`/ORVkӿ" .KB)&-'q˚ g6RHQ_)UfibOsjC2pjKG@I.rSc' "I~+(֊ 4Xlyj4٤†JQ%!*eD[:a!L%K¬uZ+oHrlpp`⑽ɍq=|`>l3ba%DPK9!5QLg<-),T.$QyTVw=ݽ #m󆏞40 SNtg .?߂6R=:f*}D]bRG'\% \cpO3n)C@y)O&{3~{]xCn#Xz h(G|Hyb?j5?,S(ͩ rQP&ҷr: 9+rflt.J0yVxP@xlݟ@[%Uӻl}LԂûQJu.,l\Vv~QyS~to5|fr=K{_=*+ ]g4j/Mә??`5fgpݔgXt}9X>>NmjĔ18ļ|axj2 㭏CZONgm@,:9̫ZMΫT=/ܼFo7Κc#[:ctUV0\peFliN?mZ8k 6?E_Z|sMT͍/]F+ D~p1騹έ"gp-A3 ˾@B?p&a^3 |\XB^՗^w +lXZ4_yaU5 ^p ̂HB9UaaR7Ev GEZ2P1AjlT.ۿ1EJ~}:". L*Me*" u8Z͖9j%IPjFd>( wR>RL@GmW12#XP1g&jZ36Fv J9LVw6}&i\J_ߎ]Ue5{ hF  (LQk %I P!`V0/2">@N&I;-xJ8 \H☁ 3p3  D9$&09ruiLv/X+cOHƴ (QHLd$&A:ł"+KX2&JZ+iRNZ  fӝDF6e$NBߏPJ0ɿxM\E]R3E9AZaL)%O~ʅ4l~\muͶꚛ][ls ǹOĊm.i=L)|q@JCuKϫ &ʕcģ(Icr%Tmo=CHI 4D呴+a i-L%Qj40bQG"`"RSF@-a$E:Cʘtù S3udˇ}s㎥ X{Sq`_'3=Cw9ɇ }Y̑U~d d v1gXr f`آ*[?#L_zyo}RK8{ӇA&|ַfc*/e>9&?޾bN:e|z]&ojP"?v {zM\.ʦ M%,VnZ9JE*L<)g)RRA(Ŝ(aHr |+ 9__+pPv}nuohAS U_aP5S"V.G. J l%Ϻ5ruȂ  <0Ľ|0YuWf'mD}Im6wvِmB%^Te[-wLZn)jJ/8\裯t~Uy٢L:|7-)X;Ԅ_H_WfXs:~sr\-DS#s<]wIKJ`f[ &3gng`Sf9,Sk6=*Bz Y^Sh։'kc5м{L&k]!qulĚR#xf"n;m:|gpN8g/{MPͣ&EC`sEY8{&?W ڛ`®z֟hm-LJC-ا+YCݱ!#(ƫmLjJy SٝmݜJxG:_:՛pjpXs]!3_!ZW?(Dweޒʣ$V]Yrqk9N 9wiKsEMj\1WJJLK-I K~)(t&r8'VO|JTb&>}ORP. + X`z2*K詨D-ǮLTW Ld'`ɨD騫D!ITeHMu[>#(۾`0E'+W/;f-r-ƗEݞu_3!$сo}ұZ{ԜDH5>`x3gg8\U|-Bծ> ]֥~pzLJ9܇r~ۓكj)~N=fϩ@v]A;M&oOoo-מd,2UsEέ"gp-aY[)JH^vЁIHF JiLSB[FY%rZaT*V"/>,PHV.*)Jk̚^Qü4|d?QSP{E=>TZY ]gfb,2i5ȥ n\t]+.LpX.,jUD+ ,4DGK2,a4)XȬ: ݅x {Pg `M MEL[U":]dlyj(1ڽXH1 % 0AJYIٙOu2ˆ{q20_k3/{5_V]? 
x W0xFFk.ÿǤot5EST9T3Ndu*iH3wnw˖n@VUm;7{WƑlH}ɾ dX0S̡lzfxCR䐢q[4{U]GU4KT [XMsz iKxR/nȫT44fuEWœ"ɲo:)5)ZW'g̎1;|Zsuǡ(nRŽ)]YcU%JZBӶ xnt xbISrK&Nҭ*{M1$%TpWTo`EӔfa ؅&>>毸PT'57ņ>n p=}YlY7<0I1,Pjn 혷16QEE:v@{]4wX'"NFWꕛ|S5 (O Eݍᘷ%I ѯ/+wtm^oYޠ7~ez*43"b-a2J, W!-Qtc<vMXS<Lv )`ron.)sLooL| oxyT} l߽Ey˰eU (FTKʿ?6 tv\UJ?0*x/:XQ$6S$K=qo|{6R{ӝztq : ` LCKÛsѸ_^/fsq_g2Qvi,|0~mg ˙Y*2}DXH w!NsWjb{ow3޾۬ҽ}Wnvq_*[v¬W逰  <Xeﭻ6̃\z/0XRRېt0'% :˲o"R`䍰WӠa%\咴vOӹ][UJ@mZՊDX*jx"b3iLҹ wWN.*{;T.L3lsG"2)abf<{x$ A"垉$"6KE>09cvI3͔ɵnDU)4|۴B ]9t,t;DzO#vݮ+:fQךtĘ!ı00uTPԙ<3Uvh5Z1\M̨3hQ9b0X\NPL2-spkl8 q|g-%shbx :Ei[Wt2i m]_U+Ⱦzjj|αt9Tf=pZ;=wWie=QT=h6}`e[ ` K " rZXZGS:Kc փV(0j>#aA0p O$ȱU1 UYJy>%4QF [#gKxO7ONy9Qua嫼M~QN5PtJl wj䌀&Ige>Yq 8ΠTU@TΪ_'?^܌/^΃H+7޻jm휡Ǜq1 M#i8Gj4#p050a)`*|8NٻLNe$hI64WFbQt&<}1M+f"~p6nBVt iu)M]p) vzVē)~&Kf33|%5*_^ҵ[5ޠz%0_~߿߯_ϫ_xo1Qo_?/^%o&ݛ_#@AMi[Cxb -ۜu[pSnr!fK|)B`݀@],g@Q_?"m&VZT`#RM$Njm~dT@uu^TK屷n@׌A:Ak9AkeSm¼d/h<*h0SE7Yjl*z,Y /QhF‚2($$,鮡Z:m퉝N6NL)p˨ށ7`-M:^@UWjK+uGH,K9_iqmbls*grKFDOlɳOpwWnR G!/@ے9aPHRuerFDk# Юכ dhxJ~:s՚[L7ɾ4γ椇W//x*`Z` J r_} A@lAq^}O~_>-@1ڜ r˕p6pX`N|tPI$?ќ[bɷ*밤 qIe4r%C!]DQߡvP H|AB? 1y*rluǗˏ]R#$5m]b>YTÁɥ(WWA%JO 0,ɋ$R&L ;Yaw|۠&|yU,9cRwyy5(s.I΄ R;/dN3W]~[r8w0^46]^ȫZw"}Y Ɵ"G? ۽1xo>6*'+sۘѽ}dW7}x|^_ x|Hj+en091Gک۩'V{k 눢,b-rD$d?X*3ěv}Ad&#QHYu2LwZ D佖豉hj4BZ"ZVy[#g˴&l~ . 2QCbA/ؤͷ<,&o!iN+o-*OnOTu} oz\i真~5kI+}(Y.]^߫ Ȁ]LzgΡu9:ZsPltfC]e2y[vyxsuM-zk~?JZMwy=8~|_¸W7t\Z v<ܬ n'MuZfv6ãBCYlNsd/IzĊ?LGp4~3f"#~wQKaߋy0()eiRX gjx:R\2>H }lMU* ̪/'?a&/j4;Ѡ UV. 
.g/ vcv;MIMmz)zٟzE5#6%_m/l>OU Fػ)έ"gp-HF%HY7ųﳐYXǹ ńI Iy1 b6=k%e\+a)G&B`L tt *O2vOI~-Q}`F$"Lbe]Y渧Ri7svkREXSD|0arroD1(G¶Ef F:rG7}SBE BSibWD*.U9e&:0 ¤h4PBB[#uO9Gpְ U(G`?M(#DOp).}6]F{#32x5sf"%ckl-Sǖcz|^PCjmj~+O@ʵhd@aRg'ZK(IR yIt$ ħrR,(Ɂ#ȅ4*ptbjA3 (؃Tt#ZGƴawL~u]]UꮪۑniIeI6ѵPva~.}Ud?NJQPӼbq?=  t`[EĖP\"% 3Bs4'+ GS{az=>_Ćo)J,Rg;$~,s<bS“EE&b̕_fd7C`q,&.L0˦KY$*E>Ӓ)E 0AgR!\Ka at낵0V[5u j]F\-mtƭTGTDJ~ Hryk 02J͘",6;˜pԳzV[W:kՆeׁ5> '7i iQJ|@J$H^kWMn+xe"b1iLҙ s[u!]SMʍ$LQ0o1(;JCTI=< rDZI,V#z!Uxd!R&R/5e тFQ4`S8H9;FOqx2Sz;xҿ;b Itj>@33^m .\ɔ7*p&1D38jb:ww5Nc3Z`(3*h2:Z0rD`*:dZ.g )hl\<{եt7~lWt,K.nK gB"w$cy݂2m1efbP3آ*3itvЎ'gG>&%X'7/K7ˤpn!fw>L_]> <t1ȟ/ڳcef?<.דjϠo3e1fy_բ-7^6}*? N/fIurI ,Rg2DOwv"k/%%D )XZYh6{{' 9o.FM8]PftB@+&5i&\\{$N0'ku' ru~dBp u2qg-nL Kzݫm'mۈ-mܷ +@NdKm:.tn*KiSrk.N=tzU.-sM|6{ΐV՞I#-\W[\x3kyIc։S2;6~7zLRrTū80;z.TZڬm}=}Uoڨs<+w'ޖ'-Fry<1z)"&Ȣ RzE8#MA`Ihϑb`͇l|*5Z[O[uI__ҝWGj/U{XwM.z4L Vz/ʠhX䬐)^,RN{U~?;RTXc]\geWl-%ieC/ߖ1LVMdGpn\o װȸ{mƻ~ߧnMŜs^\a./}>Tw*RP2s5|͚e[,~L~mDEh6SHd#"AsE=&RσBQIf~x`kzxށ,(HP!ĴLhmV!\hŻ:qyZ=emΕ$]h|}hWCN>~A9e8jcDK$F 橗NjXQO1uE%Vrn<ٝj6ERCoN/+(v%=s' _u"1󢄖ȑ[pOo $bXb/(~ NM^x5YV_hr8;F? 
j2MGN8*^a rQP&(=BΊo {J~ oC͜kVڊ> eRiY<6snnKP)_rz *HTEt}u^rI>a~4ۀ{̷Vk%d)WSDDSMzG:Rp Z=B‚)8FҎY | \KL:qyWk3pOl_t nK][+p,?ڔ׾z\nLqwOw]5{3|]raa s"il>8.L,HQp#X-2_ZATxG&)B*"Q* 0Jb,A"A$''Aܷc^-q<(͉&ºWZf6G-<sQ tԖDRa c΀NK/]5jkکByDIl1뀜!"yΰv;$Uv!Ꮽ:|Hl Bod?/{\Dp_Z$KN sq*b$b傴}L0c/&˥~R"ȟ/yRȣ}#cv7F"r6Fg)zSwf<\Y>%x}:*aP& 0tHp?J:'C75H( |Ň9w~44a\8J?pU[_^<^ϗ?ʨ*`QsZuwFXtP& +XE$S ʜֽy¯28smqп*yKK"Xj_f͕Ι;W M#i8t4 iFa{DBGN>nkF1 F6ZOiԦ2#PHhR׳Ws|{NW4ﴀ}78E~2z.ݻË#t%o;3 3=_<M˻).j׊*aVqW $/~|/ӿ^&\¿{'gM$h~{o~}hJƫ2lsw60?B,X}B`&09ec&`=`%8bk&*M>S(hV= hexPQC{Nj}~bG߁myMNS-]@@r A^3F#sZ#-SYwhH BؿG3D4FOVd\,Y /QhF‚2($$0tPCVfă2C63C{%FxxMIpQ.{m72lkq~9-\`+c1F13ɬ`)$ŤirM7ׄX[Ly+c^8 Z:z@ELåH\h'lJaq5a#A2CSJx-RUe9;D[SH$n}59zCj25_v!ꍛc\ aRY &cexH^rТ6$Q3-9˜R&$=ZF/&soh -Q'@'v2ypr!K ڸvXJITiP^*rr3ve&/΅zBɄBȪEa8Pt"Q_YN_BeH,e=4޺ө'zwҶWe] b b DP7Ѯgx|^?5n^RMEK)3ɈELu>^v𲃗' / HzuDQh1`7k,Oh l)f7“!D!e!x0k5f`2b=6MVHKD5rv)us8:n#3I2Q,KJ[k(|/ͧ:Kn +L ft]»w=O>RuVv&r#i} EE66 m:Qg@u1af/0Πu1:+[o9̷t:o%Znf!̠eVݻ;=/o* CK-7h8ŜwYa-?}^id/[nm~$\2.msK ̌~Z01)Tgk1lfb Rdz9?3,'[ҢM@czn~aVX|k 7*|6xӟ8Ow~o^'mwQ{wA=%tn_L>O"?&D{cE0u!zd>HpQ yţ((-񖻳{3|0arroDA'6$EZrC(lTe^ھ!YG>zTr˟!. 
L*Ke*$ u^#p\-s g KB}RJNR>RLGmGǫ ,3I5qθZJ /l 36兲a^(Z^S^8% rʇkهCWX~tOppF!,59xѨN0+D{ACJG€ൈ 'U:ق|M2QY1u@er4H &$$!Gz/ *Hc˖ZhL+ D: DH[XA⠱S#1s:8ZJk)]RQShp'Y˪6f_0!5 /a^# {\x ''e47ppo0=>MA\A|߾yRڱ;(L0( |lp :+n9p&B+BNˀp$ҭ:;,-j|$EF/yr,?+ٴzEot_oGŷ/+=+'?-x?Ɓ>at7 K{ d/~{we,ef0v44lw NW=_bl.h"[qM#JK3hn ӬUU ;EWE;v_hUeqc*h%{gqЅ9A P#ޛL-IZfh4eJ;no#J0٫*-WɴXDi(T*`pG<QTЕlVyfgSQv4*ήdW\)L+ q4*Kб-;xDR[dWBc+;vJ]p}:(V-ٕLc]I͙Gî\ձ+VHz dWJ+NFGî\鱰-;xveW Dч ݿ_컷 97 P+'@RkDي1iN1e S0zEbQ 2$wT -OO>=RPD1 Qh"xL 9XYsm#ˁ sp}BR9\BrCv m`|c# T=7eE(be2e Ab@JA\E9EWRP;5aiz6W  ̊N[lBrsTa9~\qEB[2aQA7>dE:[S0d/ƿU|W}S PQ~6OZ`FN ht(שr*JFO>S5:+D*0^>y]|מrD8jcDK$F 橗Nj@RxO$ ^ؖx!JzhwΪ}Co̠R)Y/߶͞ű;o"nk6UQ+nq_⪉17Ŝ9s˭#2VFѡ*2uIDsݦry[:Z&&VQRP@Tdh՞4#Dz4‚S?yvΊ\fG,?L&\]_pyn?>tzƥ+ >&,6r໋̺W_>z562?o-=K{"v}g7}`ۃ>&ѱ<{p JXKTyD«mV[oE&E=OQ*-O}{lSyA9B-ٲgj`Ӛ8l+[Dhs~5 hT̵Q*g er+U5HItbq+c^8 Z\7$X܃Lʪq$U.4jC!jGjd~rV|H蘋Jx->cmz8kZ`H_zwa`]g~C}qt6|f Zk/V[)sXTyA6^U zyq*;끺gp鼺7Pm&WӷYV^aqczpĠM zegȹ`1X>v0i/9h6$0K5S4af0rFpˋ>OHάmHz!%[ 8@cqt\s濕Wi+RIbtۙs;J6M% r&JKn"'Jќdsk,Qp2 PP⹼⹽ټqjT2B\8&$( 7^Fm&\I*8bj6!r 8ˢȹMEZ)"V F468!پf-x>̼}Ą56rոYpcmD'GK}$pET$.T/*o|ۀ"F[sYv. ̉ښVّ%BYe4$@s3ErR R,!.xof@>$1oqq 󍟮՟yi{%S#etSܸ?3^Pe폇۸e];Ϸ5'>9ɢ8g2%ZAJ zѾTkHc*SNqpހ\$ƾ:-v XyؐEp(9'r{7TL[^GZ!"UZl&p>lJb 3{gjwpָTvj%*Y+R isƶA{NX[yD6Tr78~OB+S{YHy@g맆7r78*{>ӆ"h~(5<]`SQ9C)*Ҩ7{b l B,6#dZ:_O}NY*x} ЎSVOA~6I:+`|<A!+Yn#IB(Eײ>ecSC"L"(z9nHɭͺ${ҳƋժ.[tFE #Q|irFEK,LEkPJYrTƅUEu<! ŖViܤbb8֚P֎؇k-6y#O>/i/nmYiR odiu͛h4 cDLµ4=_:dBuؕr mqksQ DXL-^rJ&h g6Ѩ#:s˲z8Y A+`w!UV{'"{ d L8iF3t]]2aHEb5'=u,s.lbqM㟤97gMUEzr7NTRZ<'JIIic9$eNJB:b*|XkQJNiCb(( E?aBҋHsF0K|#!XētREէ (хk§tp}E DnJ+]CWV JBud@,;vDiF*I1?؎uBނn]% yc->A`I. pqDVZ hަ:pV J8Ms =MD+CO,-VX(;KSvCyH! %:.p /sx&!Gun>`,C)9J p['ec.IŽn:L>^B ;g!mBxbtJ߳`, CaXYvc;& i]DUb|؀ MC:te0Mh%J"-A*5҉szVEC,o)52ԿtV Hi/N2:hIUT&ŐZ*m0&xڹGc,HCUfR ;( ,34Č*m )w!0ւzD-ecW,BfV"⤩Mk<)@fA|?t7/x`\8 RD 8Ff,xXLa ݃BҔ{mDV<(ZS{N#5ziV=ؤ=ɗ Uw&dd, D ᪥hsOH:'/Wi:?v*UaLMVko֫YcUqڂi#b> ZSti ja3^\0##Y ljr(TX5]A=u@c(6,qœ4 Vo^pH`"m TS.*w. 
LDC1 4&Iо"0|ңhI5w# 358 N@(=EǠ}o;z OEw9ӔlQŧ[}M{fQ*(R=w9s>Myz>g X[сd)ۙh耵*R{+m(> > > > > > > > > > > > > > > > > > > > > > >_(/Sݜ^$́thbJ-ig&NF5yJHSi9Ȧ&&kbΞ^|]\,Yq'X;N{rT5􋓯Hco̝꽽@>(O<{\-'2ޑ[[|5y髖}]wRzU:[/悉F#|=[~="꠻DRUB笴/jW&&B 0liygD#"[SJ: #jũBATB ~! T)o/98~ѓy!ƑƑvH{v:6g8) z.MX_ԗ k/yԫ-D{T;z敼dof,.X\IkKVJK}{5.҇%*¢]qW 8G4>v1idRtSLO4.Z1O9J3cO{毞sI+pk'w^J89p봝.)h#k.OpH국)" ??/[><ߗ_3/)(Go7b+]vyn3pO]3b;ÿ"-u` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` Eqo㚶鸆QkV׿b5RR92O[}󚐼Ss_]G)lwYL@Nn2>)'@R;Naw\ر?clJaЮc25K`3[Y ;%ͶzwVm8a:Wk5{%륹~͇rRcTԌIAiKjj[ ǫF0sG^U~7_Ǘ8 ݠ{~NA8+x0e'L 0&xO {NmY/˫wX_AefuSݳҴVH)E~&tP@*r!p %U5ywΰGsQkxN$j&9) LN &F`qN6:$(r@,҂ٲ| 0ozB`sm% SVX1h>6PC"[ G4g={8|]|D1O\s"\f *Cd9c(_<*q !8玝POc`^~jq!Ƭ̺9hβLFAJ9c0Re2G{ 2UjQۣ魎mcLkpU*Wci[*3OXDC1: O[F*H3{Й3f;7W~Dž\hIbn/L]-4.8 *_ULqAt67"q??2{R_yue;Kd.܊N@vcJ0a^ u /x\b<!xw{fj,_4Xqt6r  896P<4 B l&u9 d 3V˕ϣ\ g8C#rZ$ X;@Uo#E -Y/Yc5,vh묚U>{lՊeأ,Km1 R 3*4j1Fe8LJI1Qt .j.XgVa.jz5Nzj|E7([k2Ai-IƉ0 KW6)T6XU ]˼ENe=PfF{f[5k־9}c` pxЋ3˦O?B{7H y\tnuCG3*''=6 gPn.NE::7RUb^}{qcrqRtJlʕPl\ g"_UW,[C"s&S(⃉X .}E%Lί,oS1KJʻةf4{#΄+E۰ȕ.2^ѿ >L8o{]+l;x0ٵmVf*+f}0ӳbۇL^dձ={^x|K/x[eW/ۢn ߪzQ(zXHhWGB\ORhڝZ#α@.j\)K0B[eחm1P쬧A:&HOQ8RǻZ|MU(/2;~AwX1r6Ö;-^8>VgU>1C`p6v.ñՎќiS\;?haǂi 1EɘxT 9/-ɨ31Zp1W)P.f; Ɏ )3*gs,LIqDXJVÂyx (\VfX,iAͰ9[&9Oe))bD豺А"QT׫]Y{xϞo-muS(Px[3+1޾pR\KZ?ۚȵ6V٭hh"_ק5Aj5^=,gy p3]~5%ޭwny_xs5=/\FfO׻͞v _1FwSԇ9fF|kΦ?kͧ$ښ~jv͗\6p2'vi0k E˭͏dsh05m: +m]d'[=`$;dQ @ōQEk&tn12֏ż{s>/]Fxe^a] |_յhg&B:d7D.#q8}qʪwGQ?pB%,1)m{Qp* c\t`r)r6hU1|䎖ip>Nsek; - |_ c'&7:D|19cE|b`i7™pf+=s;3'袒QjBHJ :$F(sQIq9tJܪ$"M JKbl뵔b;cWY BƒgA}X6\9>ү>~:iO zKl4`2Mx De46$W. dF ɽb;#pXl@1s2&K1+jJ`am ULVW.Ff`\.;vں֭n5* wZ+(mYT B^H݆ƌ ` b.-zl>kTc-=1 Rq0rdbTiʌG%Q,jl\1zy kߴrZi䴳0E\ͳYVҺ6YHJBŸ3*)+LϦ|&s))Kg*oK%0DXdhZ#hxY%"~'% ejPqr1R o)MOy $C1 ]{݇%lZ4&s9`f&(As(,%8ZS+ܶ;]*@x(&rBbnMWѓD- z' p>=[{*ՑT*W'y⽾)@d7Q;䬣'ǎVGo+ VǂYebRg+(&lP8蓐:V6cBʘNBBi'2rYx=qB:/_ T8R1r6)v8Gcؙ-@ao.j&w$Շ`I1hZ\ 4 d%|~? 
:ȵ9$dG⑖!4Dl02-"ޒX`ϙK \Ӥ+G@1X,Z`-$4z&WL1 Go\כ-Ra[?*+yaN1} % $cqk$26$<\绪|#Ϳu5 Ϋa1mk6l6.OBmiI"p g.}^O㯟K$,tD$Z16H1Ť5)&w>aĕ2G@1X9LEErN.c2*AEUG9u,Hə*.Be hrfcmm¦9[B*h8|y|'n>8/皈Ӯ&z^d'76 ^ҏaćoRU.UI6rn)$56QmYW7D$0Ƭ"FƭFxI =)Z@/J\Rz|<_ӳOcM1h+ZH%Js%Zqdp ؑNOKu#-p%=|ooЎ'gGS&Sn4z Pop}⒞[ً }F:cƠO|{"/.s0|wz_g 7>kqLg֛nerכ*'^\o?}ݔ5+,ȲoZ9I!'&}\e`rbT+1*%(G3,'|As\j:c׹սU-6{o(ӮX3?jJn9sg2sɲn9.X-S1S<9ZxL [zӫıMDH-6wfa^WY-!2O׍^v8]NZ)wJRnJU|[XnqGRIR(w]aE;c1Ľ8Yhwhoyt֦e&.5Ӱ@u{C!yg&lǼu9+e<&zZ,=fq\XuSVUZ83%VUWnvnSZʻҢmp[qR7 SiÎ]')%1YēJ=6 '1`CŞ^M?929%&|O{k)wQwv M'/N:~R9/ST# #O|f&XOL'ݫ|fVS/4- X+b~p;cne~ yϠ3 p 5 >t/^&/rcgm3[}lP;}[f -fw4,RZX" qWΠDuS-۷|`m7{Z-jh2TEe-Lžm-oǢRhH\ )g#2ٻ߶nd-pi]^lXbO[,)ǽI$[KuԜc 3ΘsQWH-%wd*k\22|řIRk8iTh"¨3RWH"p6*KٹLMUV]} -A ܜ 8uUWWJݢ/Q]xg w??~fƜ$YBzSM|Nyj16|o'H%5c I7hJ͘a^p&q)('*7%,!PD*"}]_,?N?xXGc֫U(B`*Uo-,~LhmB7:p鹉>D"x93G8sYb}E%-涇:ra[l4q5rw ;O"/TJ=TbYځϕC}%$(2Vk|DTErhͣ{H{%=jJ֭Y#G=mCJ&HZp7f5A ,Z+u8cѻG;wbrvHT|v|!7|Ňٳ6<֢(oXM*vbkr[7in:)Q,Q!^uǓenwp0{%bhx8( )H7֥;V=zqtFԥ̹D Qa M|MUDnZ1A!HR$f'kax:E DK VHd%j ~pr3,bX|9?Ap_-|)kC3@[\z2bEUef"s9 2L%,K< L.‡{\7:&Z3?{Te^\gmK.B}j;+0s(vÎu6tLK(WJz__‰>GP.vzOC:;lqj4Jqk=]?ֽWCp_)+|\濇ɼ}s7`,遡rT.VlJ[ D\stlp]⨂^_\P,x#+5W9XE'L P8kqW DpLs&AIH$6`JIgOBZ0, =sVz.e 빱1$ˇ& eQ9/C Oj#g~U 5U/sկƉ 76ɝ(nM>ݠT ȅO菧@=\ꏏ={Cf8W H}w͂ q:G ӀiiFi nII"S##4K+m @y"iZX4t`3*;kYBtdȊeB  &|!5D)|laxi@UɘRyi+fQsCH-leC%&XLp@I$I6 ˨Y ^e)b1J&ijb (w[ _Y"!(O'c0,FnU{f*,}}<ޛ=9[[Smn)f9-YΞUgmvK׳]xIHK IUC:p>egn]M[,[ѵֵ]v>qBZn`Eag|i;/-ft3/D_ZD9ّm-a5w&Ӛɾ-·ҙAww `-w ^|\G `뭑Gˡqkr<>Y.^|ݍފ$!DH.I5E16HaT}"jIl*) Jd|Is!Z橏Pnk T:Rd5$TR.jTGGγ&.'^⻸ﳜ^Գ ,k+i"(YVKy`&(8oZIz{"s'%AՌ9)K E))Z)C}bB2.X[J؄Jf,Fv͸D}u* e &?yRN7eݎ|L8SB$dem 1KkSj^5j45;j.(e,GqD9\EŘrN1' cS %p) bTmA[{HfEyHD[) P猜n5 oJ944.O]Ov{V -' 𡲷iLyn $r11) 0H"FeGzCErY#TcFxI.Rg:EK%(Tɓ!>.?!iM㳳67ŦmzuF|B&KP##sA͝Q*BUipcKzz#@;R4Ύ<$Bh4P_UUV+8'/@>k:t.83AlE>1ь}uޜ}z]&t`Zgn<ЭM֙>ݿmz.>~]7/Zi妑rC$.}b"bjR %Τ Q娀rjP 7܆JP+pVu]tSHE`T0PTчH0 V^$ gCAGB|nǹB_}6gL4S,$m@x %"Z(@n7AKf(dܫ؏ӔEXO&pja݅u36dd?dTCP*ㄏ:hSe*Ooښv2BE!! 
1V1=;J&o̮ ZpGCq 9O[A*+H9%7'bwG'[obg^pz%Ruhv{aja\veUtwu(&|f>0g( xh}M;e=%P')i#bLf8g# )q.JI)v<7w1=S͞jI䍣dP('s ]g.I&R~zEp 1Jh@̍^z<킣YNߢȵ#x~uðH~EfybP(\y'{?ٽN6LNQ\7꺹3FÅ8<@`s׷c_=vӇg:_C5Swo_n?w>ρjxv?y+L0Jh0=;~S:JlDNtqw($wxO?}=eۿ}p/ I91:?F9'C݇8|haUY|qq:qMUF X!ƽԖ.me4!8 |he8U RF&p~8nB}t߯;:UZ[uHĈ@+þYσH?Rͺ`nD! 486`$&=4*)Z Axؚ@fkG溰&z ўB.M['4 с|@.wm~c^ig} ,^lё|$K.d%Ye5EUY"QX Cu֜8tG>Fu<xգ.Wϰ4WZHVOq\=ns7k$; dz4(o4gЫDDOvk#r;jWAl!R Qc:e5Y[&i!LYc3%:yc4Z)U '.y.{+=8?ǵG =4SzR RZ4١ʙ2L fe48PXΒ W^y(ݝ7ZbD֛]@܇Ytn1}͉GoH#aZY6[͢xbc5񓾮`訜*ϛ!7kPkP4λtb}:t}`)HWFk2*xǓD!&霉e-8ƵY#s+שtƽbYפ7H^툾:Y7׻R~9lnz'z2gTgT\ i%z"%`,X/d\:"q9WEZWEJi{pEƨ^q \kހ V"-w]"=\}@R4A*s6pE:WEZyvUtW). ="qIk:\TW  $;݇"f=\}@+%H`UW`:o )• pU^ĵ\t7?$\aZޚeR}ſm tz:w?~䒗txLZ,ˋ-;IQ8k4|Z#16k.\ҌP҆X x$tKk@=uXSeǏQ(qami_-4>.EscLwR$m Bc@?̗!}\RNͥqGl苔IAJ5a &\!ZEx@lH!,3 `*9^#ZBqnhJDf$`ĸYb2 U@f6 dSML£>2㕑Jx!۰sqkF֥8MX3|g2Zd#:qI#uSQYal3lУhXߨCp=z oGu^fy}Sۤ!EmKg}4١FgCBVڃYz[7T )YdF Jƹ \gȄ*8rg Y $1vjjN #D#.(͒%ʕfJ}s@i~f|4 p΄_{s"cɔ0WRNK˟_ T9+Cl@⡱:`chJTer7wLIX&;J!elS {2 lޑ*HiRVƞaBwSV6m'lӦ5ߺ]=cl&,-b))(ss襏PM&;^`оXc=eO/;J/ttȔ,kDqÔ "EIDKdlE*4)2"Gt~L2G3z'S ]^VgA l~r4T ${GqNUSkq}ٞoc:KR;,$xz v62 >嵽*t>yј"';Ġ k i,Zos4}pGs!lewmxxwK/=z^k}>L-np(WTtby_C1-6i=ms+#*ܳ^ ĆWm1ry14aDs4:I~8T߬ev٫n^$cUvL\~/VLJX-@ ћ{Ӽ 87)y|Dc\'CTy#13K,x6*^gp"bQ^3` |{ZwZmF@tQaG;8숡e_+Jkrw">MMgk0z)aYE)3 0bH924,qˢ Q, +戫kRBRJzxyHzcTZUDc㸇WVgݜPN^q&ت<وNeŦPOV-?wҳŨʻ33Dޖ$I hgQن QP4VEFmM1*bژ%=)`aqʈ6g&E͵+@62Vg72ng)'j3P,4PXxT,;}--Ź<*p>߆#6f+DFgy9\2R'5r(!X LXꒆ> &66AjFh2{̺,PLckLg7bx׌ˮvq(jʨm{P=3ã [R%DM$L,KKTFbv*fhr +:dkbbA$l % LNȨ&8pnԏd2cW8]eDt="5+QVEDg9cI.ke4 r,$t !%-Ho8*#b5qv#¢\]6KE`uqq_; YY]A)*69o}KAMu9^.<*8)'dV7hk0E6)C߱lLmm'aNz0s9e)!e\ k̦hckhEnUpt&kHF1=x`Y%n>.?|bekI/oD%|pmco!G}j jM-jEbzz" #ᆖeC2G/]65}csFj>.#+\ ( ma@eђN>hA@#DV&Nrj"g*j1A.W$)v2[ 7,vgVyn6SʄE2oIq|۝)yݲŒ1\1A}B He.'CT))G}DqѢw:9εs$ᧂe$k0 lN$+1jTg?˳ށTRKWh6Dݮ-Y'nYFk9ˮ߹FΟ5+ؑI`4eq<2G -l.z;ڑsvk:FpHPa ܹ=y|j2}&x1ˆk_RA>]凛u֮C3`۲v۪v`nr/'rb)0gXLq"s= &}qр2Yi+)ê͍s/P6u:|R(>O]{W[AQ~P_ha>i^]Y=:*Kt5RJl;8sSc~o/fx3J/~v"nRܲG掠x`ĸYb2 U@f6 
dSDb-vm#VKxpq*ҫ8xLqYŷ4Q@~!9!b4s>k{gѧۨa9e&g_¬;U—pr,v?⹵gEŶgZ'bZNGcZ0Xֲ|Oykݳ rI_H[M AB<M=zW:{U΁kDVr6Bhm&^d1$Et:g h1tcϵp3Ьh5 Ž^Erٸzuj SPgF\Y"j!oCU`t11餏&4734)rOˁRx;;^v]p|G@WpdzknGQba"3v##jy:ɢH% TĤ3C 4 r.󜘵+0J4gUϵFV(cE $ZTItpAst,.UM=ygAgk.EAhhiNBjK BMWӖISx<WZ|ܫ4gF WB-sGc 1a " Pkzs&1BtDI:ф)QGl>'C}cF'aH+ ![*YK3<)BR 0w{tB|B\iT$"`p2eS+>zD@|8!d43*$u޳m,WJ{A'IzE_j{[#I?-ڲDrB?ڝݝH*aceuiWiԽa=h Hb$Hhr9FD( ;$ũU#v ipK8_=z4g!ک9 J1E?8W ?qIK^9 l9y~G!QN+5<5JP\~z|0EmK=E?_xReQĎd 3v9F"rpNSw`"A1&?:@0! L).*,RIkX;)O' `XOW3 SwxKS2j`e_ԟJNnP(+gK\)0}7ӳR=[9Gոttq/PqHZ4MÐanfY>!0< V0bul^7cr4/#AwZOiԦ2Z3PH:U+f#~p6,>isSSvKqˁ+\rq]W>䊟Iݤf3_%5 6wwLU5bo0yp՛'N0Q'G|xd/]n 7>zД54WM[eh欫_a\ۜrø 1[j% QC4_Ӹ]6Ɲ6Ң6eM-S0 J6VG_= 4 ZT`#)Epx okB=ujdy-`5cD8bpa%`+g&Uɋg7pa)z}GכmW̲`3pf*]hp͠sB袄L Q2pՒJ!49ϞOB\E2$&-ЪbPsۉ8,tY~Ax7 ͱ yQ&?^U&1hPLs ;m68t592" 4Y2)^rV_)-ȐåOOm"g4`rLn)1H/] \b[Zz;M3k:tHu1#$B$^w΃R$(bZhmV!\49G} +^P'_yt(SL w\Ft>t9tuSVFs?N9ݼ^(;q#BW?*#dV%hê%awxXE 7WI[-y.wwsIFCp^kC41Cc10 `v!F3{y\ygڽv4*k:(@YʝQFG w i[~ ^g}t<DZ)ݓ' tDs >aWtGr1t%ᆟ\_-H^^~>2 w]̙27V\("Ԏ{gG #9͒gq41znIf/U7g>Xo" t6-#o(_<.a秨S`}Si>={w̼҅qnW]ۊ KnTnntq֮/覣e=zz$NCt?0wZQnyo%iA5@ojRrtYe.̎0;|<¼aYw@WLXQ(,DˋW^m$JA"mg-I6x\.E^ yw\n xm~GV-HJ,.ơ﫿AWŖaJӿca ³-P#; Iޡ٠6ۛiw%[ z ^ TAtmj[R " s}@8,x0* DI}}'bDQ-xV g(OL*oK|]y]cRD5LFE*8%q,GXQ R%a`UݶpcڻNja)ۡ]# xŞo~n!D55>XYs .Ps m1}ZO|U& 9쁤x1Ox=olaeB[ QRqP6Bre9AD !+9-,kq H)c SFHX`n+a A2 ǨU1kQRk Bzqyֲ 8+O|oNՃ$=hdmZ@=DWOv04lbm+.Waa s"il>8UF5 `"=uxThlNHTDՃfEwZI%PD0@AP(@2q=K= 0IYt+{lb]YMQ 1<`. Jm *H$`3g ? 
םhڕhZ)By6 l1뀜!"+gK\)0}7ӳR=[9GոL|tfŽ@M#i8G7 Cyyd`X(XIB1{ Ҽ=j=ɦQʈVl:jBM#i`pTV*,yΆӇ%G4T{nqn)=_}y9zK9PO/[DQB| h8OҞU]T Կ&S &3׻<"wN^z:9y &藣o^,e\Jf 5ޯ>4m CSVZ9jW607B̖}B)M~ᲀσNmqM<,|m>tSgKT>1}'*M>{wO<'MGc?FAV;GJQ&?5G8{7^7۬POZ$(Y{)ysϝF[:n%GG4FOVd\,Y (4#DaAfxOtPCWX[8Vi3lZU}=t3ƈNWK.q"dIJ=ŅM9\44Ѷ4rF9p3PrKFD.&%RG,AlY!B 9(!"- BBL˔Q*Ă[ 8S\CԡS<|iָ+ A9ePPr1"%DF#K'5 )* ^YUp/lGo} ;vlJT&]3{K.h EL距}H&zr(^ڲ)'I\qXtԟ S<`s-:2jF'_6)=T&&VQR0@Udh՞4#Dz4‚Sw۩㾁bY՛eҫ'vc}>Hc}~_W6ۭ ;UqT=Up5UoZ惜+[Qks~:-xTrmCc'Hh⽒%z\[%9 `'ѥx*"YuQL^6`e]8 H<;!g+FKqB0o_GSR p"kWP,;LT2!eLXr:NB*ĉLh%ჼZ-s6-iۀoup n~vo5aj:i?j9굕MҜy^0^{$߹8LZ{/^{}c~wC6eq|#jySLſCό対_~_hQoψ)JDBKL ! Woz SfU, X4<xyRb‚yPz^sK-{Vb1Jibb 悷y$4J@lr0kQY.&Nǰ6??xp5҄\o8}L檟FmU[;T?j67uf}nz wz8jt]3뎐kEkthd߶z彞SvIS͘riԖ.ȫ֍keocطծ{5 ռPr|k=6B֕#Eloq +zIm)D.yr K)WD;S{Fa)lhBDMé,]OUjM$(E2>[-G@oeie҉ڗ4]J*ߎMCȝVl;{f=r{|-};f7gWD|QڜIE T@ ^+I5[aY5ARBUʡR`z^ q\&蔸U1H"%DpHlףb3㱶P[GOW>:Î/? Ymd[l4`rD 2%:i2JrLHǸH!W,qV*g9h6L*/<(jJam`ULWRln8 XbV[ںZ`҂z*4{ŠN3Po陉>^0{qb1@.A(J٩0 wZ2 C,*5!b/i1/a؃S j blS?Kv XbZjx#$!&sIjgV&VZaFG1I*‚4e;GeVރ'MbVsf82>/uH{f#"eⴾjix&㨏.wo7[w˫^> ů56ĭ/m^oy ejcӆ\ )T1y녾]]~O\ ><}1x\T[]"L2:(L113mdPǽVT%c3ked D["DG#xCLk\ yXK˭4,팖C.R#w.G3TО;| Ǣ ft^QScgkJ ֣ffb*^^Ղa¾@_Mv{Jk-s9%2LΡD#Y2Jp䵦VDC`a?^ 9!C̳g@Ri-|]2z+ڥB=A*c g]6ALjd7Q;䬣'ǎ P(:SSt'X0LSLs%ۄ *'>r}rCSguM\& ]+tH{_EX٤ Grdy@ܞX9/HgƅA<4n^^aɏafh>h&4j:˂LVs]{$†y`ɑQc gv'\4bG4s$aTq_b:UuR;$Ai%#CLTLyikDY/z[ rpPϛ񦞇~=9 B%cz pؐPV)xʹB+1n6^P]^t0߸(qDG^*Z壴YNUc'=pbrjn3#_A:fvhr?5bqMLWޛaN|=(o]ysmYWX/~嫿O5єwa/?#t񗞞8־ `Z.iUa:;VEwOފ㛞dvï?|??1`q/ަF7"A!g5ϛ97L=f͠`dNM ƗRE!_Fc"_H"}tzk}̍Y>K.*gM}.k|gk/"܌ERW)E ^(c&i!EdiL7]ty{<*U[7K&]rB؎V]WwU֎l6wy3Z lZ1M+PM gZsiGMB~|nnu<ȍRoi mA&ʝPa0il I뇾~W`5!!G<)X,!g+ԩyXVб8c[-))v͵O) \Ӥր*8Kcˠ@薅>$WkՀmrVŧ[゙_3'1z~z|T'nqL&գC%c"K"j3CH /+b9k>j ]I#WD5[s;!у&왐aRBJi>VBG L=>o{ vЀc!nDƆ(6Ʀ)2>[P]5Cl X?jtGVpOdB֟L7w>n}%K I:d"qe-b씬PꡮrdJ˃#`I :29(fDGUPrX39U\$#څh M2,04Hj9VH@N鱶6aSP8uǙ_ɍAX_S!7i,2El>XCw|@y[f)<>|e*_H{Ucr&9z"fSyIA(t_oHY#!Dc[V[8zI{SD]^2 J<.K85x)zzuӊկzHqOў1+g'R后KNOKu#-ᰰ%5V?#0N\pI5Ӥ!o7vKc%5Y靓RJpx$#&-%RJ:I#.|fzfѿp+.5O 
f{գ·wW.KKzOc0yoʹpB3MDr8X]-N,Bi\ C5 {0 >*x/ 3sbQ2y|)Ɛ,Qذ\X*xb`W9; bTr2*w(G3,'֗/&N l<,ėJxM _83w%MR|V]jr ]L5n W8,"au<)(@גs Ԏ;` swu=eDAhL<F!Vº7@&UAR@VXX-HMRi b T-gYﶦښJpEE!! N3=%JwLeCZ:pGCs\ ]Igg% 'F1tC{p4\6eyF1qD$9l`|Sɕ/ fO}Jm{J(޷}hx;dMrqB *8ErN99׊sL% nN&S˓8HtWVʱCZ RR{M|xkn j;^^.Tqes>[݅]ۿh섡&o8z{8>RCofw۳u?5W^>xL̥?LE3풡ih.&woPz'!鲭ڍ,/(!G5F Y;h8OekFF:ȶ^۞U0kĢI>zsK\>}nzz"~JOhoԜM<(J4S7׿5 _'>Ojx6b E< 0Jh4lj'=?k>F%<•r.PHwϾyo:?;g{{W 4dm=c߷o5祺]s#Z|لoѯ%߿W*u;2bCFӸG8nՑV|U!rLQJ=zHF 9}ꃰS~v4Z=2hd4Q@ρ./\|7] ϔj]-Gd@r  V~O%C4Y)!5QL'l,GbW >o;j:~1 \֛yX B$/Gu^@ё) #" $*1 :NGiHYl,=L%p)S]QsQ -KJTQgi`e:%D#,Jrk)zXQ)Qa0w! ާ8Me~=!z~'>]Sf۫oͿOmSELm+qG|#޶zwdVn/^Ѿ'=ALlͿ': 3_VQ^Kz, o6mnzD8pY\c Q[_Iy>M(NF\UuTBYA4%$r9elk˶Z,RAy*QC0L3E !.J%/[o#RpC'FBM `q| '@úb٢@Bb [!*'.ʵHp}j;8|LOg[)qs3#!4,,7Y̳f_I3G@-v ֫ UWM|˛W?'u©Gfkv0G85\C~hÍvnolx(8KG0Vo&C^7_gd ?mFAhEU̱C:uhx$/s ^5VPc nuojCiʏB:`W{ߙ^|_c?|kv^BwOOz{7bDf^^DD }w>w>ɥTB !A -AXd$TVKQycj{ ';0rtgO4&)~;^WGi/> @hkN|& P !i^l p^(bSK'U3W^XE #P(TFqFqdFqO$w&c`T^h%\ɂk|:$%igLYo.%eKyNEq6 H3)PkhpĄ>ņ(~g#`sg? v癨ڐ' ' 9KCaTRQH}eJfA% т!8|;z0_3v;< Dޒ&D0K* Kk,rv$Z扲>fxX`] 1=dV-bR[|zElpY7䣣D1ѪuzO_*@/]9UڈR rseG/;zyqDD"QitZh)/ jp,:FbPcA9,DB:Qu JP*L/ g$r 6 r8-izn#n6[[k _E7Amfn Ӝ5VP9i5"RȀyAUvANeV]+U/ s>ua7=eL9,#,$xܕ/O{M bQ$HDQÄ|L2pm&J@"5$pTrNRF E>d8,i)DUZ<Z(-P7RpjҔ-X8QEoI3T\)0"p."j@\Yl숋Y{.vؖ x>j4A`ziuBm a2qGǑ (JDpx,xXlu슇0VE0u^q;C2J!7YOûJ"dgqό$jϕ@dJ i$56Qm,PFFC4R9XV[:zI{ST]^rJ@l!GHhnwNJR )>kLx"häJ[ɢ㠓$F^Aԙ~ %mڠ):kwwS%<(B龤}|98y"aU$5:\Dn>hqma E؊Esy:a( l4KÍ#]LL9&pZ&Ob q3pE Ou[*HmPh':A3CMP3ə9,Yw]_3qH&@)5: ; 'kǭ~ZmBv0Yn?oL>s3$:/hzqRZEΌhU$[CL^I3^xWf&.ʫV-Wr ӟwѽb?lYHLp2lYxz˭;.߽7M;6@ӆ[iJ:lʐAd8 )pܤ& ՚g-\#Y.Wŗӑ-VgQ ݋%hN a;!IІ+f0C "'A9Q?-)88YdNy~:ؠxgk)%usTR0 ٨TÂaϨ'lR/q5gU{zQ)S}J&e ![c\RW^.>.R4hy8zmB(󠜭mDfgLCQ!2&Z45,jC2aX/p6q}]jef 8_GڽƧ&7x(.*-xJdHPD2$xP^P*;HZ ]QNwޞ3}:ѥşYlxm >Kຄb(\W<%55b2ęgҴWH'yEA KH!UpJL̩oO'/F J=>XD"54rg:băCx /3PgRIRc~tW(öX { P!;z3/#{$M]nC%8N+T1PˁvD=/Z]z|[Bka8q5C*=v'd #=}.ꪥoIey'#E*! 
Y A}J0 %Dؚ$ᬠ)G` s/T~p1Aủ¯+r7r9"r{/5~Lqђ~HSnarf}uLznrݩcO /rK<%*8dM2>P@NF c%y>q`'xWs4q`O-I3+eV ]).2c_v9{Wc =I~=Q'8J3;M[9̈6w~o8?^xVsaN0 'rs= "j.`; ߠddkO Vt kFn'TFQ+h8 i}Ky'g:;8}> گCz@{f`CO2h ~nqr׷SIYCϡk[B?/EyC/>1OIOqۺNwę+_??>ze}-,*KU1W/\i4e{d썹J/ ݺR^2Isk=mD?\ }>_v`ba󣲡vP,lfRݗz6 +i>Mi*UVU41~6kpo=xG"4.sS D_KFI"Hp((*F6K.Y9Xeʞ_omz|؛WFF9uK^m$^S5 ,'U{j-J %.JKI -d,BZB Y -d.LB Yh! -d,BZB Yh! -d,BZZB Yh! -d,hW9:::=[ #mriJZ]}vIUhtm*Ǚ$%Loax,OieEQ!2&$*xph!QŒJI D)ȣ _!ň1&(mL l"*G9Kcי8<.2ڀrx*M$enͫcSQNاo+*\0TZ88p UJI * y'̡-E(o';Şτv6up_lrOt]BK+mκ ?(Jӭ7 SW# dz|мI:(FX+ (Oԁ!=D)~6J&%.*h#Xb>5:b?`RJ{p"+@oKd,,+]3d `#wڐ6-Hb&sjlDwQ(ޏo2[ 4L*GbAoy!VeHE}pnؙAǗa #4xh,199D?gu̹'0ߞ3vlr{Ro0Z $&+ zG\&.(  [}L4P+lSlڔ4ͷk#Cm8КӴrҲ^bR(Ub.u0"}dKc 7Kb̲K!]]D222A*"&w0%HQ0,x}sWOI,,hK)"p1xDǁWYd.s>w;ڠЕe5qvL@,y{3mNOH rx^`jP͢i:-^ْgs!le\wiz|x{ˀz^ky=LF nyλ0[:bHz=pWRzC륹zU]"`-ܴyV;9%:}rl͵v+=J`c Wp{NlGIkʥ>NʵxGQM>*ؕjS=TZ& .$+^X DH%QE${!Xٮl2sJVtuCjdZ#*WIaglf MUΤV/lHS#įf:DSl2Ԩ=Q< 6Y$"MVxߑ7R^n0[T‹Rr+rҕۭܝu ծ 6zJƭT/DV pu}庸;2㱹@hЇF4h9d]Dt̬O{+%*' MK|uQQY8+&Ok92% 'ɲNdfaZKje %ج5Ku/ (ma@eHYr1,q\8-&q'10e]ƚɻ à;G}w/^Ӡ3)]¥*|bF{{皕{(~9 "xnd 'cM9V\Ȝ@ςIIUC>kiv9>QkR=V4Ow u?ԧ z.uufHBDM/Y#Ӎ17e{9`8"B던^]}@5~e0xͶcZ/ĴLկ"9iҥ\&]Ve/'dyRYTΌ 14J*Nc 2(2%m'dj^f!H E &$ZZR%]j:C>%st,.VM u~4_$}]Gf~ֶT/oK>ļEKT]J+-_˭%m]T>;q#+9,$#hpdZ+ 1bDIhN?hrJ7%UA!r5T'ɓ"$-Uz58_PE؏1ꗔ^ZYwY# SNGNF>aqBYgaFmum+gzi9H+)+Z>MBrJ{#hq@_v5̻S/i(_WHOodQ/'^\}Gs4\m.R*U5-^?\_ʟߟo/O_^'^f)m"d47~m54ZZehSøkr˸;fEC%h RNu+F) rBQ=~!XAyWg+:WݟӶ?gkm+Gȷv(h Lcw{H("c}zؖc=bSql]KxN1dA'j !Կm<ǓgۭKA)( $+ɄFoD ʒ2;"^C2D+qk$ݺ#+,1{ԥ$uA~ƫL$ %Jyi5*%u<ȱ 1qx浣Zxܝ>:gct zW/qAۛu0X4TdOSu5vI[_z}Ûϟ,jƯ=P O3 =`9cV$sx_fp֮yfSx]-BeS0|75sn~<4OЩ7?,xyXPܿgFAq|l9b;aaLZfslX}togZ^}xu>>`EM=Wl?_WGgawn4xƳ3{c7NA=]ny4BLamylQϷYbdwLZJp'{QM|6æ;k]k^4 s jrêy?tt@Jv|`kP;7Ǟ;oΎhvԈa=al@D*dI D0uTG`)N#ΐt:+acb'x)/8, ('5e6tR( mb )AeSQl>S:K뇥5~%J•ma=7u/D B<kS#ڗ巋' eLPjd3tCcrJE=s~<*afo-޺e{ѵ˸s[4:nWM\{ͳ&R(d w<7!ht/tq8fx $jQR-EhC-gR3q4F`rnKŤFTZUQuz!9+ F` *AηClen(q1͎ǢoھGnIx^E8_h^ޖtTjILM4{E 53dR\S<,ƺHJ`/tv5) *+OLTD. 
vyx̜x8wWL͏"bhGxٜwVʠW+UVH) jy>KB6EـX'urJLHͶ&x"8O9{]dqq\:f䑸-..qM\q%RL \ao W(a0^Pz\<. 6C0$0mAp} E?4~o%ݢBbi&mgطs2 ˅񎇊wG.EQ?xGfzX1 ,R>h%[]B +iHQzDQ|($ >[س&ŀ "Df&ЭCmǔȧ4'+v~sy{Nr ٍ)J>7]~tZW[= UkMx>!FvQ|Y dB6$g$!vEe&Z.8yAA+)ZKyij-UFN@s(7^AY֕ѕinJ{JB a {{2uN.7[+;Jia]npL˺7κ8d#scw/)uQ`U' DK!S=\LsW0Vg[4(!KS"XDɕ .S16BPL 5hTGc:d[z݌gd-]3mҗ/TM^S{o|}JAQnh Pt.z,Ы*ӻ>nuzWޟ[wôq'0NOxf3aZ^4YӽחaEX-vbsi@gG_X zxm 8"$ vZp(_Qr&%8ҰG,NB,{@90rcʱ@UT^gvZ>"B㳸f:Hd:P-wzi׷=zWσumJ]X#j>Na| jʔB1DJ^t\bDi:i ;c1!Ds0} +/9JǸbȋh--̜+A qQ6^;՗ؖgOHW yҳܳ_yW;r{ T:u=tG -Џ4GGNW6C=18,)*x-!Zk$a3@̽(6N`jnU5bȺ77WDzZnSZ6XDm'sw6lz*->\/j%Vqh brh00s2R >'6ታ#{^sF{ P|shpG}&$g+aW K>N;lIQ$t9Yd{kzuR9fjr,͍q68?!!/%מ2yP&/l<̫9ziY` yzgϟ2{0-f#kJowm_ k4mM,=Jlƛ[K3Cr1tf09+:`$( sc$1MBĩ[4 %oDKEh]%R7\BW˖^ ] @n]`_]\%hSJ(%HWPI$c0 5Lh'M(IKW/⊣֌6hW ҧNWJ tKW/`v'=r(]`Bc &1SuEtŶ| Lbj7J<3]W祫>]\lb-]걦U,Ic*U)tjvt(9-]@"^q*1t*S薮^ ]Q j]%5\BW Eg\Rt#"D"M7oͣq}RXmzͫC@ЗaC?PsƓ/bfdTz"^ zNUy29(WEO|g+_bwᄁE\oƟf!@.@ᘷ d8=>>_+vr|-m 80z^+_gAB+E~U_𮅍a?] C ,4_Q,ߋ;RHx_'4/Vr:ډ֧zzwQ:~m̩ͮLPGffR;ĬW逰 ߜ 9Gl\X6%PFWEdRjü@v<{O*$DZ=!jI$EsjME rHFo+냉KM!ZH0 8Cʘtɵj]rlף_\Tro[t6btқ*h~L}QךĔ܌1Cc10`v!F3{y\yg\41E 62(C/*GT K5 T!Sq ^g}r<'"ݳS_68U]MQֲ|&Sɞ?.B?sВgJzb x{`WO;¶o>uzn/~rn/vf36݀| v ?/0x=DXG}7Β VYly_[ 5.?mxkgxʬ5c@MygƽyZF8 #\NZbxbpA 9)"pĭ9"!xP3F"McVbE  y9NKI DpC@,s$#5Cmp'z2ztO:2Mt9[_[Pbs5IګӬl`C/ɿhak$wVs)!qeu NY4REf)0c AhK<5p4z*eyR  ͑ec6B3ETO:fA:'uOpEO;ÌXgej@B d6t,6=^f F3̎aJȋ3Ekiv,9QbUm(uP|66R:&~s?w. 
bUUA(עT׺(מ}RCAd ,OGoyw}.CvqvD^?O•/_,Md|dLf NJly߀ɖJ?^;ǏԹZx=O!~hދ3ӏTev[֤֘lqQE1vCCslbSaM 0[ƾ.rZ_t7/<\m4HK=™`jBy$>Wh[ 9TZ\;:35հwynϫݬa2J, W!-Qtc<vj!4]a2>j67V;|yKO`%_\IgxS 1BNFfLo_'̸߫.A_3xRu(b:Nyΰc 9SDήمbxl<: +a0b bB)fSK R*k_Y,[_1<e\HQ}3`ׯGՀ&)ƙ۪49aKMQ<8TY9ҧtgV<]ŃIݲr /\y/znlE0_"̻;⪖l-]U5C*ч|Xpl|p;]dCF6:dUU}eDRCHqϏÑKUO_<]|;ĚJOL`ZJ?u~6?.w xrq I3;!< w\YCv-Q ]0Suw\B򏏿|8ߧޞ>9&?{LUA `@A?u)iho4Ule.|vx].P+_Pb?EstFiU~; #}-S0"(1QQm\} F!?&U2*hx-v[\R5(=:&Oo .iLz􄖪SOP2ݦG1eHY5d)V:Q]$`RixE0F13ɬ`L )&En=֛X[L5xe| b[ J^Idҭz.G2XM@+m$GEfGvga0=A֔,$]LI|DR9.Q 3 2Kk9 蹎ĸYb2 觀c ![Bnb-V>&r8ܭ%"E[]yYe] ~o߫ͱ3ɘSm29{$+GG`tX`3 $P0:VtW3^ҫj'${R0j1(F d]L~0R&͒N*j@MTI .:0)iQ0R:UYXf @\JK#:'qYd.s>wKLcF>! U7i3S=1,1Ř cf[[ )GFNEi3L+YERrD@K@ JidVϊb"X DHEfp*" -Zl_rRlR*kIqߝJE3xZzR62Vg;2*հf싅2 ^S[akܕ>#wӰ8u|y ?nGlV,AcrxԮ*c+%F J`dK.4cG$&.L. Tq:bʈ]M툍a\11OiǾm+PcS-,- &8#HZ:HB.*5ebY \2B3U0+m|@8L̐YA&CX Z$a(JJ;1մ@}e<&v&|'Y#v\^>hs1  K1fˎ^vD%AM]D222A*"&w0%HQ0,#X;@,ǵ>mcC> ^l[^»w]/(AȍMCH׌}!B.~$ pԺKfzs4NlV} \Ή]J򳁫"z;WEZOZwp*&:#"UWm|VS"%> \9%Y:WUV<*Rʎ]}H2Ʈ%.km"io'Ӣa+|"ftY<ا+AJok Λ\F>4܀Vhse{MFA\2){U׻ah4~b% <Ћ]D O ׵f~%2ӑ{BQ'"gb6) ZFY. 
uX.|q GS8.Ѿ?}ƻhOP?1O; ?3aV~]TlM-f P'-HwXY igGP̨,^D5POe4z-h5QjvFM6SRbL>HpZx[20=X9;B5qv̎0)mMHú\>hh0Ol9rx1L/h'YM b^Ft"Lɇь.Kw/b؞gD%e5Y3 };M&Qޚxs*SF !$(q͜: {p:Ųp)@U\JE5* ,V}Up-51Xfw5C!Cۄz)mq4 Y/wA,r0~E㯇u7&hBKA(ķ, R8i5:90z", ~pn ?i>/H66K?O<].o9^gѪoE i4Pƌh#48ٛ$ܺr#tIf^[ ,ҢƲÚ ک[;6q 1ɢJI1Qnǚ.^;u+V "Dd-ܿؼ<2鮡6ZZ&2 ^Ph !QiFd+s l{%3i=N2HOOx7*\9~]?Ӟ xnd 'cM9r`Y<05 *W40ff߃ ==ȸn<:2f;r=k'o^t:v|m 8z/Rɮg.1y )sQ3e )v1 ?D%Nth* 0-c Fe۫4ͨ] oiGK{ߦInQt_/p *t Y^w!$wgӪnl-_듻u0T&w^V!W/l-ZvLkcZDAVYbZ>\*Y)ggœ_TƯo*-ǙYCnxX\c;Fn}l/pr^ce]G#d?F:1CDN`'e62<(M2BDA0IȢr#)ݱp0WRltK'vfۯWУCl"c.׫ j G6[Υs"3^/D ҅2c*%ec LƘw:9εs$TֻdesCUW^پ_Y?:Fᳳ]_ǿ<56EkXn+:,KiJ.ZrJ6_yCȘtFq@_fz0esgҘ~,^W[hI<*I-JdW$D^so݅-t;-K| ۡ-l߷WWeew:<M;o[ ,-)4j% wQ&BS F$rbk.V(iD:b0v)c$pP8q6rV8QȄQx;Di h>7&Κ l oF\Bk?jn,h>_ZԅpcѢ;krHE`T0w6PRS;"ZQrwm/X-(\gu1}K{u36dd?dTCKqGdbKgEL(xUhzi-'!w YQ"0xdv  O<;DžxHnM{7v2Y/vnF~~+=gqXK&eU:Ўw{aw{沩1[D\r)b2 Y!TNCӿ{m0o&p;QxJwn)I&S@6NF _3%90^p xq{\%pezBrdP(sg+?li^EP*fǾ 1tl+KɇxչLf gI{xq2<E~u׺ wh옡&ɯ ~Yq8NT]IN}{63⯕«|yrv0Jh:~閩=;疛빝3R+*綜Mn//q/ȅ=EݰH7be(: w5u݋h$[<,u:E.+0a0=R%CįzL;M=dhE~8U4 oC㿮M<⯱un>".C'LҩӓÎ4q]'R%CRY[֗;{y߼>W߿;Lwo_Y|."Z_l!漩i\:]&G=5 )_ B4r͈@cIc1M/ [ lN:􂑌DUt!T/NWcL#VD|N+þ[hy.DSzj]0 7Gd0 Y =M u+϶j])f}hOyJkBqf!:;Q`LS@d!0MIkF^CWwSkbo;Y{l2j}f`tvWqEMx('~Z||蚨ۜH] @rP8l!)a HdHma6riV΋F|J́*TA"cbA¨fQBpkM1V)4جM8گo04 P=Wp*\0ܴq *Aq"%BT@AyepA7ޡ*Eݼl {{ {wO& P @p)5_~7IV<5JbJίU*:,_-ߗGuHQ.ߍNE.`S0sz p<ۉ/?NZNR׈#dp*G,YugJ9zQGGSA6mE[Yz};trMj񽵹 ٴy ]g “† ]EgR74٩1(ui5ڹ.wFA9vyH.{o&p?9j\ǃYf Ol%p"vf85T@GE=K y8 Yv"hai)<1 <0n2O*݅u!QFGυV()5XC#w&@SB.*7F+BƮ4vwW`7`aōQ}}=δ`zqdΛLۼ* j*Ew`PfnP* !Fʊ0VEm 'ox}f52gAې S! 
r))C#ՆC!@97zZ>)1p@sZۻ)ھyDY-^oYtFAN-,5s*`YJ BvS xFeg#t'ZU)iv!?{ܶ_74~7yHҴLz89}h=:ѭT,EJ-ڲLJ4.FYR|dHEҚ^iE$uݸ^Kѣ;tEp?kD{`dE 1\0. -("q[-,EL_2Eբg?P2tPRڶfa-lI_jC']E:pf&5f=^f=F5O>{4ZY{'X< "ש &J>en q:Nzkunًa?Ex4<}? lv9?Mޕt+{;<;џG8/b5(/I/H XfIp|O5i80Lpn0?N8yd#;hAY3n]n$'9cKzJx ;p:Q^3{.mls<́mdAy5Cq>b􎻣aH "ΆgbFe'^e}RAoy7{rNG3lS --oaa|k:4GTcB`SaRWR8 őb,C"ԡ(YSXQYbUJZdrD6{^~?0WdyIUr_3}<{;1lTEt Ly_a} Xj1[gO`٦3x\n޴o `dvMMo}Z.I~pj(6K켊>~ '3'd<>//?O<t݁_ˢUo x~KhZypA:`.G C'^(s/SSRvqR .%RZf" Y%lgR>S`q9 !?kC4AZʤT dƃ$^0uORPR}VZ |*2`z} K7uuRWxV}BqtNͮ_ ,5QRMԤ$n= Lu'eC1FUt%zvD%[еUko^nq?SX}w5HU| U3L[P.g\Q]j}slP67ӚMU $sg6 ju3y͹3hcު$"y{jw5`ŭiݝ%x̀fǦjk1vO_x`E+hzS`jBy$+B4qhՑ )T6-Z{.,V{A-E->3"I7k#b#&HUpHKX49&qb'Zz]<ɋ0Dhc/fmD!jQ/zYd('_)6+.@E(Tqʛzf1A+zĹJ17\1J#'K#vGG]pm+8F_#&!~TZ-`)ho=XXE70*2~x08J'9*@FYbUHh$škjkŝVLV 0wpou벘Lll^ol]@Lnl?ss,],ŹcC:lޟ'8f+nCъJ&^<0aVWf=`64` +˩ "" bY)&haAt\cc 7z:nwG Pz% 0!S&q l^YJy>[ςȳuY֨s}Mt]K+`(Wȱϋ:DWOn:o,z4SX|]c%uFÜe/`0Y/`5P_u17. :&0" Hd 3âҌ;$8"`fzVw[~, < 졙uWZf6G-l8 Y)DNK5T㳁[s^G8>_i)JD<_0K"]UZAůJ}?b*g;:&{_L>;W^c_<q|{]&sɛ@ \?4m CSZ9 kpSnr!v}_:f@k@(nrkvAg;`)]ْ:cp$n8kuS(7®ikQ4hexPQKȑR;FwosokB=ujdy-`5cD8bpa}wə ]tib Zq 5$h>hH!߃EJ>,W!CwWک?d"~o}k, tG!- YSX?|0ϣW3=ʊt |4Èr9<这fE/Q y`?-hR_{vޟ{ l慨>':˼ lxNmߌ಩㺄Oa2HWEoKeF yJ.eӪ:sY2;%98 GQV?N@S.2V W)uTL;H7tC8딽\6(u5(\RH11N{YqO3A>7DD BEn]GwJ\er WzT:RWU6"pYـGTlP=v dD(4:qSՀ_:\R˯.=S5@⚑E~zxɊ$DlfT,YG0\aUT#.Kނ ?{d_eE+- ;\PRϏ{.RÒ] 3ǕZoF>UH!ǐBuD-p<2N)\KH@TMiFbjhznP^]˫6eȒS:gDPH;p/ , {# ztnőpP*r-'"B6xјI&*  y_ʍjWo;y,n42 |iBQqV`4B8%!xp*LePJ8j9mi o׾#KL_mͳWo6fÎb Zt_se˲8土Lb uTX̵Q*g er+g ݅ 6^G-&G2>F\E4AgZxH_H~90.fnbFW?$)ˡ,kTϐ E#ҀmS]UUuU/1 ư!Hrssqϳd@Чc6 Bn&hg@olaPH_I.J;L#uE^>rܪwzs:E96s1t{0J5UWdhR)ie-w)E.18@"\}Fd FNOB edeMrw0%HQ u¯6X"tw.j%_ ƪa4('o[ԓIZhyO؏fzLbiX8I38;.ʯ瓓]VdZ5xB0#nww!lEﱆ6?k38nj>,++ js`RDܴAwMv|o (}Hk_p©ULY@2!ub --: x9QM>HthqAٵκ n tyxL2zGe{4OhPvg`rوjgft.fΊ^a%XKe+¿vH׶&r#i͘DN'5.$n Xl-{j]SKk{QsO6l;5ܺ}VWhamJ07yꝷyRm]{:'V8HŌܧ;6kTmgTT;}^4_vRMe=dLs+p!XgV>-2D,~c((u& .$ϊO1yxHE5ll;/rt2m M>oOv*t|~~Χt6tuSZ(]Hڗb2Ip20TBy&^h6,:" Wͨ1frڷ6fllE3xZzLJo9kJo7Uu>Յvsf]\6aww 
Ѕ᠙}"#̂s1dTco;;DK0'a}e9rLNYJHE-3a2)$IJ_p۲-1r\69ɲ*z<(j^;%X*d "gh:O'K3;,0\7Ē)^T0c##!Ij6;+gUCjHjzBO形TNlms0*Kr^'ՕRHtLIGoqgl0L'\HC Bpn,/:x ІlCy.k8ggš_U@Z˃3FX98'#Ah=ڳ(* t2죪FXgD uVxey-}wwi.|.8svxQx)bټ]ϖsIxb.5_?IYlLۍ}:VccuqFe] S4ȭ{o릾_GGxQhu>][HNn_k^{hg4"^ {\´%Cn)^q)=@'TX)B`2+Ԟ0t9O#u>"3ZX+k k(j_ԁ!,~hʢIg BJ0,hCΆ&')bR).,9"Ja)2p!^VA;km_A3E EΡZhvu;Z5$jֆ DLJIFڣ [r6)mvLi~Aq"`w$.BDL2P`?(4t}MC ToTHSBX^?N5?#*:8@̧jݏ`Xd+_Ə"?.g3QqgxTJ Qj.(Ey]<=3|p 7G+OEH9Y^hW@柫`2osػRMg_l,i'gnѫv ~ٺ=b~S5X\pgv>5"~q9h!jhRG߉ښ R'T_vrϯ?{1f%d-i{2O ܜN%C#nQO4Ι'aB.#*/'Enzw]Za&s. }IڣZ[!o3;^wYw,qVU/-rCS*7 Y^>DmrL\r)4F%p )QdE%Jyv/tq~??ɿ?s~Ai+`fmn[/70̷oݒ~䕜mY]k=x=i<kATmu )&ß:[*rJ-TlI#J&XQJrd VoTΠ!d,K2U@ώ8 HST'CpVW>~>ui,@.onB7wt0hj9 c{w6WmyX4G+,{0vN8vRMr+ƅd ;:QPGi/gGiB9)J8R\ ݄rK% k]{w4uf~T{Ĺ8_ߧK’mrN'__.k{^JJ>[g(|~D삲[*B/n:8UZ@_糖u ÉFsSs9ov` bJbgD؛4V?_~?ߋ>ꗪAK䯷eG}WJwWҽxב?sf.4]O4)\!N>ɥ/Х\$]>G%ST%DJHfSEiɦ8@Ы$} qp;SK a^'+hQCDw0PQh<- \E2@1}Ւ´<|CzBL 鐘MBVL?1LYaM>;dO;`1#2Mޔ1Zhړ H):&YF+2G21y _g4Ɩ)wa)R}b'lqg;AV!@P+u2#\r5S5&kԠd$8IJNGKr;ޕq$ٿB'{W%y g]^۰<_F<IdX}#Gy5즤2`M&QU/"#^>h}a:pe!)D:puBܕN^$f&(FÆT5*pRGzۅ*g [jG l.bZNon"$Qe~c A:$u;NGt;N>R)N!t;NGt;NGs0{jNGt;NGt; 1sVb.\4߅W>ĔDK/7ucˢ]p&XYK+%+ 8j+ d.\ t Qy /MJ>%̢.%_Ef1:̑2 ڽSAHAL̲vG\y7L RuL ^D65r7%^a0x{qբLBhܧMi[v5߲"6B@12!yJJ鶬A ֺ%"g%Mӫ̉zמ/CAkj:%0 !|RGf?"6ָ.lN#S$Vo!uH7?"o!d)F)E+-nG嚱[XO ^K}pyb+^if7~v6OE5Ko{]wMsS=g8|h) -cߖ2??hti7z&\M?LgؕF[䖔ו:p5zK,NѦ#Rݶ잺ƒ c~C^Gnpg՘ݱAwvBfZv3+1Hf^{=LF-?]Zw}͇y2ȇU]WL`3 U"\-/Zye2cm{h:z=NMr8|AA) B\5L&w_h8ݣd&YcQjXR40R&͒N*j@-1NJ nJZtv[L‘dSɻ駃oG*aYE)3 0bH924,qˢ Qd9"ˬn7i4 T^+^^E!"ר"EߵٯM$X&N[ }NnDف7Qɝq[u|1Æf@~RU~Z2]H[- Hp2R0Tx%mXu@ElU˨1fR%iY3ؘYtBXAcrxNp#\)4JV:4#3ȺIŖP=Bb668pduYY3*T\KGøbծ:ڶնP=3ã Ի$ƥ 9M$L,K*#!;Sfh r /:dĂI@Jyr@N5)lN}!ŝc*\,bqEt-,bgw$a$Gќ;ɥF06k.\ҌBo)4gU,gL 8)emD.v# I']lI 0A`8-⻋D.NK묶J|]6_QdL6hD ʙL+T^@H40!wvv\aq=!?UL$?4]Ip]#e? 
%uv+uJhj|BrBauȷnSTq!jrp;*ўy9;>{F|6, aNz0sY:;[e䔥Y2s&3AB,G셋ܖxo'LT;A|eV{)R6%3vXl<]mznwR)7mÚKÇD{Hz5Ms͎1^}$ €W[E.ٶJ*2՜%4V,Hˑowg ΞQ1WR(,DL1阒ހ`aO &/NM6ܑ6{+k5=Бc6d2CvY ƙ6>348/ͬ]UQh-xo es4<̴F9ϒH!K*.@&}~,rLQ'z`LH; e} k1G1I@$f;e9y+uZ?ZզJ 'ˉ 0 g&.0JRlr֐ ޴ :ZN@:' jٯsĭí[>C tG )- } ]P]w( u9SY#kt3 iݝ/w]s헢``2eA#IF"蔝F2ɕu2j ҆r|2,Ř|$24iA$B:0}f2&΁=ytRh~6.{6R~צ;t 3_.c0w?\5B\-)DJnO^9^2HRΚOd#`ݕ٧*o+:&q'1uALh|gyX}uzЩ3A dzGGAgS,Ki=pU:x>j9W#@*i}5?ʎ3RVYoB-:{( 'Խ@QmϮ\`>\Jx/ tw `37\MK.ƈd 1`nŠ㟮f^__uH_xPk5Ҧ;|(mOz\ W >mW._?.#^fhلv/mGdt5^{k)@\MiߴVbFë``9^?vY?[{ʷjN[ȲIwnٓϓlFGO}?O~r~>%t4yM߾y7}snx-x/_)&xo}^Y.aV_5BՔ=%^ aOJszj?c]jH#d:??w*wiĽ2mɗ 't<_ɰzٟw~dq$()igmt6s?h݂'wnWL7-ےzpfY%7s]zۑɖ }xӌrT2umݬ߶e`~uH[&~zh4$)3 ÐQ4HD9t.Ӝ#;.ţ+eZ(n1S'a殧h?; Nystv0O,Ң+ƒKb Zڿ0į&10]Xv|y? ,~knHg-PEDFwns>7uGu{Z&2 ^d4X鄐4I#ra‰`E^R+.IMYYv*UH3 y\E9GuC&f>m S\Ȝ@9[LRYtgC~Lj|. WƱKUAPo9J-םxRWHUd8xEX< 4oL|;#2nxS P)7>yX2+!>Mb] *c'>JˬeOB@.Rʌ4Q- 15ZI̫5}:c{<`_)ꓮ)wá<(QVM[^Uei-Q;$jILȝ[+vI}ܫ'h?u|(_Ӯ<8nȥ oYּN#G]@fN7ܔ NlPe:.ݛ[jBZ}1w Y5e=XN0Z\b,1*s(cٻ6$ePGˀq:ٳ ,8CGMXJr2G%43dWW㫚jTWXS/J"yWi[,\M<::i>~{9kNsR WoӞwmE׋+VxQ`2zt4 ҭk ׭pmqz{#Һ/Һ85|'8-"8mB0a!~8o)'ac 2gGvr3 f䈗s./!x3ID5 7}s wwM&hZMT1{1ĬQ;d)?ztnrz}YЫ'<]s]0b-[{^3 4l1X=hvqdl[A Ny?G˂Muv(R[D 51#/bMxO%RnSFW~xJAKÆyZW=7ʰգCq >%M6JW9PTVTR&K%: kbHg[Rї!x%K1RLo8Bt0Ko:`1 ),tбtyLIqH\d`޻F? `{w[ثսi9= G} ڛ''Q$m|2:B 62rdMUv^O&r*4) e,EH'2OZ mGpBt#k z#gi%f7ʾ>oLp~^V-o/]%>fH_ӳ,/|W_V:,"F٦eAJ\X@P&E#k5$o9@ &븰/vŠ.;/E*Y9Ak&畣"UVq@Apu  (+ݷ">DIȈm1(W[B2}U9mS$6{{V>kEH TtX$ &Q?Vx!E%4"k5(OQKgG<:I+-3.s[emq;S-ʩkM;a43%.r`y{)iO[hR.;]p{yΆmiAGWɖ:k%t =J :8oZ|8}88;<>0^[Vp$HVmǮ,D;Y 2. 
y;Z;"|C&N6)yWS+wy[yI!Oi.>[=}un0A) ~RԁB$/eƿ^WJuChcNG[v麟w^t78|m!XsL9ouZ6?+o`{G"}aT0V.,0=lЈOG>sr<9frVQG]NoԾ V ԑ6d֕'Neğ9:S8jzL˷ZG|X6Gr-iK+L+jB<-xy-l_u]Pe2d:=ʓӷPW=wW/աTߞz |_.[a JߚnG@ _Ͽ}h-֞v 9wSW5^:΄x`0gtn,Sӟ\yn8)t/>fa߫VD@>Lۡ{m nOӄu%(]t:dYш (H٬dDPL5idPǰn;e!fz{tgk-⦤c2^NdbNh)U) jP-P2W[t-*eoz[sueҭj |+묐w?xcQl:~?4@WCez1VqlP1l ߮XlnՀ36JNȨ.FUh˕ !{ "P!Dla&N?zW6]vV]]|/>{ Ի?렭R:#kHVA \VPe pTPXڻ՚bӣg{,f%ze?e6ob7l̀7ZV;E~| Z\{ک!.H#0'6>:mY"OOf(oڳ`l'_u>j= B]zuQ\s}A9?~bƷ"W ooGwKR/oG%uBw0f~,9ó쯞^^049qRߋG7cg& 1g-_/߿ޟH=jY9.Aؤ=H}=mT ;7{Kҗ+[7:0X ڛӋ[mg'Y%J<.r +1nxs[; oǵ+7'FSVS,dg+=fCE\l{lH#zl=ңSp`9O3k~dc˔ۺ*K"_}Z++KuFMeVfkQ,&ۖx;|A"ת8YHj祲Ɩx2mcU3L $U5,DՖ&ɉѪ8-%h&(A 6k>{Kr(Dҳ%$'B"PFS**Z;&.ƼO߷ }̖=1I#vf_vv+ʓZ! SXLRa%*u-2$뒲Ԃ^S9_L[:Gj5&u)a,N@RÒ" QCNB4=An"Xe2V4bVmotʪ֒3e>Ԝbg۵[pDK3RyKϔeqhB"na6Luڵ&!M@:JTT0˔o YSf,4XjP]$g1gk+6^[p̈ "ОF8Q̼9eᝏYlkt0֡'`%C(d?[ 'L"[v$ȦUY:U%#aÕ( V[T STBf-H}dCn (#kj&Gpp N?JAuĂ,bQ.!q+HI=_:gI!)QebP[Ycծ,m}St⒳"ac%VrV Y\j} 푥t]J~;0I%L DN`RI,"XZPT` :Y=V\vVs?FQ *H"V@y-`.,OZJ%B[U`s#AAAC\`-AؐZuqj?SI[]3FT4r; XaҰ2k K)vdtE >{J. EDPQG,F1x7$|j! * +$( l3nGZ.aHb8r+R{HEC HM|3.9C0 d0a"0C$Lj-NMr0%2Y,5%`A{ŽA l߷vLcC$:3X% ~bh& *<UZl[{Sa*ؙ@pq0Ps8eoz BGrpږ 0}!0PSJHҦl+vN^eTcރ. ՗44/OAQb)QNK{! jHb#`V6@@;7|͈Z _=&M23u& -jnA튋 r)-cV)9i`1FL AD DX3Bvv#m%]82(}XwF^J.G[-Udᘂ[8; 5 J,BpIPN"eiyU!|p|R i.Kk4hF#,Arс}IE@V/Vx8 [, c\XDunP$ 8@J0*&` L8D@v $Xo7ecbOGz'3hOYsSXk#p`Wpcң.)9H/s 6. ̗ W,EU"Dh)ʀv!⡔KpkDI(DÎb?Y.z d l+ hXj3(f4`RduJW<#hF%e#Kqr o : I,Z4" ?x Z!XDDd*(y mydQ{gƑ,ȧ#}ȇ$glb}J\Q$!}l}C:<$6 Ȝ~>z4Qg^L`` '\ ޚ9i/?`ai!f:Si5 (̂+7B0"F,"I` e),Vk%[ 2Cy.[qi%DH0$c m tw Rמy^-' tU̖ϴeYD&NS# P\7Yr7`8𐬆n#cMP(ۑ% 2AʷVg,e98!+; >vt3Fe5M{u3lvFjn{ #`*etG! ;7,hhj_?DkޫR(\<.rL@L=P`={B )L+!w+,X V! 
Ya\a0(*bژ891p 跑0:kt^Y 8 |BX '!!* "k!GG(?A%x, ,DhI2C>'21yY 0z"=RB^MAұ]4O=`a8`hgQ2tK{=lZ#U}E[hK-"A_6=`f9xeADpm } Pq18 Fi5:݉St]%=;`UF(j0ݦz4<٤d[;UVBXW,e@6[!8=5DpRl7Hb!X~S{M5مj;JL4Y0Lh>ĥid>h=IH(rB9 䀐B9 䀐B9 䀐B9 䀐B9 䀐B9 䀐B9 䀐B9 䀐B9 䀐B9 䀐B9 䀐Br@C8 تI.z4Hȃ@JNz#r@!r@!r@!r@!r@!r@!r@!r@!r@!r@!r@!S b1q@ 0q@Z9 䀞".!r@!r@!r@!r@!r@!r@!r@!r@!r@!r@!r@!T9 Џ y<kာRDmַO{b̺b<(.&B1dk|8*qڍ|hztRx ǃifM ׯʥ~QU q=7tաj  8Gbq>\I U=YBd/oD5MGދrry!(pHymyF !{5~Rbgt8¯VIVhӔP槿>XJۯlԒ'Z*`%yl>t2C' |$ ^<6Gu2ʊ(lUsfwݲeLL*j=xVU;FF!PlRB vZuWڑNލ_f;qQթׂ "xa\նDTӪ'VB\X1#/SR;z_Nb^O^a՗. 7?)^%8y@WЫb>~l-/FG]l^]5'|>kbߧ,:㺣-ϼj:Mma6&SNSE%U*j{f4Qr8u#Qk9Xk]lz{.-4e~EYͼ Ňیݲ)CRsanΐU*kC[1xMhr^)d8K,agK7?Y |T2*W;e:?XT/ѥ Y=|~>"eR`۔B)Ro ϔLaH.Uw4+ jH{oT'J ʇsg<;W?m:۵zA3f<}-t< f gos]4P:>;66kx\:بb mw8#zӵֳ28M-=~C. ÷[vyyͻpVXs,}B]HpDn>Y{`i$mM‚o2i,vIp]4ÈˇwC%7V8w5m~'Ϳ=v;o\0 ߵB\~e+6y`W +,-'k5Sٿc~BC|dڔB]±g(0$gexpeJİ_:M=jpo\;/@g<׻{ܣկ'=zhAeGL{ ,)TXϚ6ө>!LG]wFCrވ,+ri28Ƒőf/|]ů n&%b]J) ]eY/qUlpq%υJ1nӭntc4sqj2Q'ntp"SY? Amv$:'IJe.S &XOf&d Z;/ {0 !iwɆ,Sf3I7*C9c;hby>ao/c=ZHegO>\q)nwi+f$wHg-s{$N Jc\H*be, 6Yxm"Ɣe|(Qq4<=l~t<_DY6e) "e%"K*gx&Y^mtlڡ Z0*B(K-ǣhˊ.rJJc6@eG*!*D*Zߛo&Q;kѡ?p3tP7rjR7'#^rp'4Ϳp3NJX6qY }IP NY\\DH3mD<b>K;vN;f p"|?e1Aet8}w[ h:M˚ݫXvɝ GKK]7//`[UOM3>Qu ?gtJ}sZ,~1#_joK^ի剷Ӭ@pzo[YV=P/Uٳd> ܼ9cGgn/LfLJq\.rw9y?Ȼ毿~η_0_g.{f3/~լ-")9`T&V+Nf#!`_ϥ~蛢 uN9MWݯ(Mw+5朳у֏[l: 9\spN[^SvkeCa Eě&audzvEwMWGIsiYxvUfgCP a/En0VO\IPSp: &OS8FMXRu5ި^Е,le ] t;w^ $ et8e|Yo20N7T3WL߃Ŀa hW]hɿ.^q7K\i<w^S=;lI{-[ƸDcf, /JʉNpqg{zXvzDC0n*e}$Cfo)2 ^bjL8 ay#fn[Ax#K%IeB %%<8AZi.s% ;+A x(c$r){sTLLڲa-Y=ڣޛ8ׄ)O뭜ڭEa`h_׵Q{z5~beQ˩^M?t4˹uי jO A%SE,uQǘBL< %,Ҋ1`$K$DKEO9pI[}Ӹ`PW._&c`Z%As0-. 
'Bk44"9P,:D~T8r *[Y^xf]٪()!3-ːLVi"8*R5v{eL>t) 5$F@ihgO&Se"-JA0ء9tȼH*m&A{ksLOlv~jƥGbJ5(hS-&;NA:f ,΁rQQzBԛKS.3Sx#Kiewljx3Ll+`LNF LVS\]3?xPsF\֟q,PJ@I\#T0mfmoK/JyI՛u:ؐq0u٦f]jk,lz}(ɲVAUwmH_Wr#ፆRw9ov~ڪTxJ%RP~]S)P`@5ЏSc -Idq|{{rq-T9BV4k]McFo{\On&/^5} b0]/krmWՆL//pq=JqHFu$a(=*ّt8juh=w|嘃d^FF6:dר]s̀X䫇82ׅu+D 84M['脴|ƀ*.,Di\d_C:,R<+o^FΤ{\=Ǖ1ղRN~[zJHRH]jvZQ4~d)gR`'gVC|T ꮙ/s@*GyXPhm}0BѹaX"b1;=a`zIWOް7rPhGԒ&dJDHM*"kL oNǽCUɠ\/Ew拗݉?ͦ= Ymtrjep`[(V5oǾ؟}ь.ܲi֧_ח7a >oyCG(@ZZG*R-<1ԉI`NG :) Yώd]'K!Ev%Jhi z.}R+BE!;D()!&J<0*ɍSjv;y%Qa0woqy︒rwPjӬW_"{(O7aǙm}=o'=]iIf-w:V.@ս'=A% l8և:v׷Q^/Xz|* ٲb-v=cQ]\۬w_O0]%x$Z('jC hcu/*E;TݩFEVzPʰ^!94x1rZtIBW,=/s\(Ku6&aJ sLz)7:M *8-|^||-Nf foV L& BhU:ITwYO8*)ԙ 9ӕ;'Pv ?;&v.%d?[{\LZ<\Te&7k lX9~*8D*r_{_~U{-;܉P,ꑵm5tg-C{3gXoN9,BGBX Z.dHgdhs'QiN FH6UXØY|PX.}pHO.)h@D⼷*GG!%#$_z/ɠ-n Tb dp5nh7H|$7 )$s c~+b @'ka1؄֪(Οʝw2S7PDo4k_=)Ƙ,`)`VL/c$c8t9`nYEٗ,m-_vta'&"Kx\uڴ8ROR:v"BYvJH2jNY?oI8ʈ!%DΡ. @@ *bsi $1Ǣ:k9 4q`;!%8? ջZXkc qu܅Ыѻ[t>MO5-,~iI<ì(\'Nݭ/zBSJbqt3mq…/U"Y e@C 2.yd TDl90C(yUpBI=z7Wp\WW1 2i|4r7[NI{p*,vN~ev[kΧ?oYG&͓}viSX=Oڞ0a'm6Ll7iE6(pG9ͥLhLhFZܟNЃK~w'Z)AkBSJ9= AtQyT$(Nqu$6pM}@Ebǫw͗S T$u1(KDQ`BSp>@2p F%m  )#PH"et*FrT$LSypY0.9Pـ/[5[Lq|*㓝*F%dž>#ZT+?HJBȢ8. "&RAk=w;+$@Qͨ@R.(B┡!XIJ'qHpcC֞ 68^7Ѵk$YH8i6#Dꮮ" nlf* KE>KU5Na>Fc (-B Y0b L>xλ;뼛 N#eDF!NH Q*3X0iiqeGvjZ蔒v,s̩G-թ75Kl>Ww])8fr}! 
pz*3 G,Ne ʯ<rFœ܅2`|U7x^vO'ϾO`*OʲYO@kR.|:F}]{}s$g ).w,(=hƓp$؆H*ĕLCnALfS&Elv#h-}_fbo ([K`ǭ;>k)My#\c]dGq WJMB)BW]UrRO@N):2C޿W 6cI5`=oÛ-iU) =C!,$H(K˜܇MC賳#}g>>*3%uVGҬ#R*}N!B2 kCY0s[Ҩ}tfv#;b ?j;:PXҔEDR \DDH3i۹TRl+; Qe].eXTA~dQy]/cg8ɲd3:.qr$V_%+GTXP,Jix JAUjj>=ïy2#s'@1Otp{yT F,)_>}:r}9q+_Z+Jz˃ 'k3P!'Iܫusus<dWBVp=H{*A W][Q~?A`+"sz"weMxZa=U“7 QofN'G)3պ+qKN3<*>S+\v&nŽ{ F߶dD/]~{sh92dr5@|ou$"eaHp:,O̴w %,C[G2M MD(H6ݚgэ(Y9|~8 {g6r-ބsXdLD ȄXP3FRni?~q9bAj$Ǐc3TH'cxL%F8*|@P͏EPXGwvp8mwM=W ';c3ÊR~Ş_9 DeQFyvo5(X8y0(VlpJh_7'!QZX@6|\v8DFci2 )%8^a*L}e7'o Nh.IIf28߬YX{Fn4ݧ1&=  s}&ڪT\&89qR+\?ۅ%'_1vgko:-݁l@Cd˘R |5[VճfrdƖpcvE_iDi *iLjusetӜ:]mZ$d&Y]XkWyP@k'pt M#pIǚ%7hf]||FVmjm._»_d)ĿNR±!(D58MP&<v_H]q  7_SX渚C;ٜ _0 6?G6(0p1=6]u8| gI$j+$ʫyoAh$Xʬ/~G yojw TgC*1ڠ͛r zS#ٛژNT%l9VS)Cso8/'G=Rw+RDt"lK`xgr$2qi.ze 6DHO:ۊ􉰝Lmc}f;=\BfSQ)I20,1IJ31Ba),5MM0cQd~BZK/B'2':)v5;t*ojMadGĩN/a+ h 57;Kb C. W#rWk0_ >pC *L^qdE4c2ݯj :n?_~K`^f0 ?O{-Xq|?<l~;3-WO?k㮶VY-o(ԏ@)ԀLllnr9U?]E^7MQhpx [D$96֪23ūwko~7WY_9+|S6g_>kgK_UdJFv3DS-xg7oWK3@1uL;\F&X8asn"l1(*=>{fMnep>HM I^G_\$lS$vY*u{wY>]*?׎&Z6a0Qf/@8Cvzm=N>]/fU #k;Ǯbdx;(Ϛ3.hFt\ ^RZ*` Wi  wʭfz9X9DH.O!\um.OO7r~Woͤ@bx.amB J&8LK([hԂ`)M)[FTW+ {XrEdǘ.$>&clTnil?ލ쌑vK(h. 8iThJJn# QAg5KWlSHoٝhG#ͻ[\nުf/&~>(7f>\@0vo65|{8A#}˛G7~\+" %k"+I7FF(= 6M5U> QW]6WCFbczřGWk&l<,,Qmr[%dmtkpI;&\(_biLF UbnX;Fb$[٦xaۑCFs٦dh$l36?o=>; I׿3yK[ +_4D VJi= MC*.{j1vaEC[v_[ 7Z$N4I i='Uh\#.DJǐDGGZR^>>9z}x8)uLoyT >8j٘Q_q>r6O}&4I{:=XMG8: 8\Brܡ77t\HǏxys.٪[6wE 5D`Lm%&d+A9U31V%Q| p:@%өFƒLߏ>EDۦNd|狑}MT&9U [~Ɇ1 |)V~p%Ts_ i/SvڂϤ6i]'Dւ_"qmHfRC!K Gs0Vb}L-'c_UwȉR\˧@SVc Fl|R\[rm˵U/uR d*c8% ,g&TSnpR8#!#GDdesN"B@];+VDD w0qVwA2nNWc9(?L|g' T;r~_|Q /+4fK~*KPh@x4ZDs D&)*T&J(q$!$$K|K4D%N_~ {šv_TJyTܢ9Dx/$7ĒJhnbR)& jy00T1&vw +E%j=N6V}QkŽz% q,c}o.kGe])ׂR(,Pk>::j}S 6# в/zڅͨ'CRWθӢ+L;?ioz* QW&][Qʝ[Q]`9]vY'ZЉܓ!]] ɛlIRW"F}.v:wȬ Էf)%R* G"c1QH>L6=锉.fV@3N]Ǽ!>WfJ ᣌD**hRb<xjjswj yUJLC;Vͤ3. 
uűoW-j϶90L~v0J9Q?JiŽv%jG^/u3\tBşs5iǟ-z:R,]<;2쐷vƶgCJ{EEG齒tNԊe97ȻHs:іrg'pFwҎ4&MUWi0BTJnmT,5PQ)I20,1IJbRR*YkfWk&:i Szq/=v%hIYI" `=s>C|kp#Sys#CTڪg'a+ :Sj:PKq9咈>SEDT/ic[6w@Q74Y_ӭ0%EZVQLH)0º]TZP AJppet}W9q[J΄U߾= qXMJc5= W.nChd%{*X3^;;dJLnG ~|,XoywAk`lu` Bl'G33^ մ̴IahNNoP0>/r\43#-bKP7OnUfLou(X1 cLwe#I~(zjøQYK%(7IJEv6)XȌ##"#`J{{T엝&<ghF׿^ΎJIխmm)V_]_ZPC` FrhI6ɱCϬJ| ک%惾VREFBs(6LㅭGylC$p8<=fdRP ;d7gYz6 Xf GFdF8VKwƊ>F#y5N2f6 ,ܰ͟mfKz/[х0C,1god/+.1$+n5'I/S1CY5@b#µ-▄)9|="F;Վ^je7o|Vs AYE},6ɶظh*sPWv@U2o\cb(9aG*H/*DԮbD[I,GcJLCG1#D 25s/\bH󃜌Ұ%f(4:;3͢'N7G2$mDVa>^xYYMTo!H9;3Fxem0gC 2:hߎO_k.h el$Vb eDq\Π٘b3^6*ob.4Ӝ&s.2HuA0_,o\~ bJDTDF<8EJT))ڼ-H"Qp,(!$Ϙ{Jd璕 +yDTX$(23pb'Pi_.T2Ce@*`i?N6/3rn؍m)`Sp _&e?Dս<&*D0 H vSh^YXB2$&ADxDDHk$겄jj6_ 䨏jU1*#@ȁ/&:QW1PmHN(]NX4x6&֏]D3Rɋ?BS & ':SP|YŐ? %1Bc7K;ʱ\cu #/) qpD .w AH /RD(@DF/*Ȱ%|KXg%!v"f5r+&+;M_j + :7f:!}[>+bx!JAGbP<h;[}k$ڭ#FBoR6(ˣpGU+kY5{<1ı@8JW q,r $co>a5bŝ7Ȱź Li~SU 'Jd#R7Ν2Kr6X0ı@Ee'!Y)6#Q$]nG ʗհ_ j_Jb1'7m%l3hdGW˴!*8h.@*ZJ%<2lTD0!KBA7Ab<$u#rE (F(~=JD8DT0F$K&rAT CA U -z6g {IG|qaA{ ƴ^~*w~4QZ#uPyu1ت~b'kɐ O"6žm]R7>E6«̽m!:Nv2_!}n%ENŔaJ1e}_7 σ$_,#?]VfZh%΢`]SM-aî@BK%]xУMaiA~􅭸 =ىz~crbdC> i){:eb B&7.adA{ŬWKQ_}Zdt|?*]cTP2^NL1/|t6!vIXMbR3\h;c:$aUs 8 E0)I#,X@CBҘZɇ ~1/$dKBQO6.d}tzLvfe>VIDz+?3rV\rDL&1lLEc21ْ0 F%]jθ tֆD1f$&f<.<=Яˆ.ﲮ8`yEx^&Z,6 .iSI kު^sG# ~y³}5ц]Qa+gxFwS,"0q_{zaY%AIX&eߊl<-Vq?S}<"oޓdV}Zjq)7+Y04.~5C)Rj% w>o O'wb|Von(DG4hыw\8}wV-KJv&Nr":rXxbi"zWP#LQKG.Œ"@(FS &T޻ )! edǭޭ7,ϖd oV˦_YXQ~ܲK'VP)hBsSdz8O2'iZYV.F ے3S!B!d  ]DIb»@Pl~ft̳rY V`Z`x@=S 诀`<)#\^;Y*a琪)W Dp/~j2>Y--bURl` ow,7<%ei;U2ͯ /i<$<ϪE/gZ@R0͖5l6bkl;mWut\_^镦?#0€Gqʦ>]uUU <[YV-tA LY:Lܤ6>fnuzɺœ @e3Ky~m:vGc.1't'0D1(Q O7>q:SVΟ}Cai75MN.\FJuai>7(> op JSج8GE YoKCy ? [R`R*+$[rIyYSy T{7Jޜ>F !-gx'l$\\Hs@`~Oåp&H?n6"Ts&9D'Er}3CX׽h4ݾ&B~D-vp1v=vB[FKQg)Snɓmd)0-DT-j*\w m =|uJJYeIXo.X?b55$8)޶Z5Um+@/WAiZtNئ) .ɞjw?=n;ۮ5;zhiH{&0dѕKAnD">L"c#F}JG-\t>t UѻpW6[iU'0wvQLA'r~.:7r[ݦ_H~5$q;QRds*ϧ›,޴%o[dƮCM[REKgQMXaډ;4nJb]o!Uwtu o㧶<G6ap8w-#)9-JZÌ|pv- "W n!g<`J%D˂q(DHu{V++1n Ѣ^%'Z8DD*" |j:Z~0GL4!t=Α:$ڧWą%(RB! 
;Ǭo\pRacĥ MmpKK&Rs`=ߺ Hk_~kBm[nKtGۋbRmX9[i?flJR2)PY;qCfޠDžnsz])ǥkl}K#'ˏ[ț$mm{"Mt| (d?,ANCfKK*0'=iGkK& u`Isچ vfQ[XP]o6%wj=~o2bON0! JB[G*L_a!xzZ;r wTVI\|}BݬVl1.m:, WOܩ:KKI6ddQ=PT7VҺ1AՃhQd/[$?Zbn$˨G?~l2,wd70Xy&A%5L/'wϊir#/1A(ց0HcJ" T@(Hc)_sU |9}ٓ@-X.^b,KQ$@F)dJ$$!JϚQK2yx-EDlyFx'f<=0䠀cԕ2ƞ?C;Ezի#C6J.ԋ' 4@1ŀB0ጉ?Ɔ+%晼y]uBi5~F)?w6O翓 ԏu̦ Cfģ1S}s|ߍS4uGO/J$pC##j޳Í!y\wogܲIA~#w~Cf2c'@I|['*(~11DO8 q.L#58t tOIIze ~P6ƹCZ:c\BZ3cs?3d܉ ߔq. }'ai~`2Y4$mt#-n vFf;}tNk uPU y#N,MP6x: Jp#Lv>1goD rXF B#kw 'ʾ >[ 9.@Mu->1@EZPJXnKrDz|u!nL?e6[[U a+&$[W=Ҵ8MKESMf;N"r88wܺq}Er0.SeH#bEJ/zknxv))S*ρH䇋DF5N~u #iG|ɒOs=!R dLp<@ǹ߲]1td\MWu&!ʫ#& ?X*eI ̩͡ZEjx UA%4Dkb ϸ4u"IB1`Z1eW Li«% Aq,ӖeXȫ4Mi2q$Mi^߿#=k%E'-Δ/}~qYd߳I"jqA&*9*g`Q꥓V'M0ݎ*Ju̖yRvyB uǼ&1;_1_n#] ʵmrmYTABbqNQ dwd*k~ !p@u!.I)($TB R\d'˴V8A pQ`3҅ 5PF,:P3Ey%XGj)H RR&eoT}u+ϴ ',˲n E9A KbE\"FnR\UqHrINUqmFuC^Y>-m@ksEHwpۗ.+ƌ@|EF-4†1z j'v|2mC+br1feTmY2 ~D6iO@&r{fh홽ͣD"7'ZNa"$-A"bQ98/X)Jpj j&d/jcJUݡGdUWgۙ<43P0~fZj|il0>bnسV?VKnodߌ)c4OrUL+v=i m"=r w/ 9F EeȌvJUPx ywP4Th:Q~6~4@N<-D n@͚4-;<'55Ԃj'f/a3E}d_r)wqZi cKH3&#a~eJDw4^Nś]*Hwo3jk[3b]4Le>K5|}{ͪNx-K,iTn@ۮ}UD3/`ʉV[SZR'@ºٛE"{AS?/d<a̭bJg/ZSE*a[E`[D)G))]ξ!m=a[T2[;Z +?nЀd*.fd1y$IL"Y}[duC.Vk]ZAnMҘS4 Ϩšnb\'IMI^-f8&KYRƯ%NP;hǼӑF [L $rY3]$9 ltA<:7ta{Z]8 @*KP_cǏle]Yf %WH(6#ց6iOj;~5i}eBȴ|[RhMAJ緞%5m`&!˧ϟ]%d\lRlGU22\^Pb".8VU,O/R%~?jKf|VTeixV.:xjXk ŗQU,B0־C>`:!Cs0Ą_Rx 4S>D|[:-N|25I)9e Ln4jH:L˜8SNr񲔶Y<{(Y9jrZ3X\;: FlzkV7OMsz6e?w8$\&.(Y9L(=&Le⊪Mf 3uRAPx-S2s`,Ȥ~Y%/ȡ#RU˙FgnRG_+Ie fO4uN\$KԢ#UI}*:)N:gGp)_kCFs':'DXPD&\ V ЩnXEΟGݥ\L<"œbfΎ|ݰ FDa3[8 bXj k9); >.:}يrn3T5v$$AST54.<^ }\Z1=V6  fV-RCߏT dC:B/6Qlj 9ݱnESֶa;u@%Xšim|]^2؝km迳 mZ"TP$p.HIAPE`#C)pR`k("p5F Ax;~H m.D;Pic YT.Owܑ.$ ȓP %HqLV#`KxHG-4|1n1#`$ciM9d~[c+ܝV@|t+=A2EUEo;>$U`P eWz)>şL#TDK}؋A+{a<;xDiZ:~{ % 1}מ=Jo>C6D]uzA88BRQ+_n;뾊E03$Jc4OS- VWA.8>1y* O{i`NC@jUtGz(%m .@[R.GC ڵBQ;%_ɢUAleR i0-TE1Ѥ ,o('|7u%0vAI&_uu_g?~o?UoˬޯYWלar?B4ͳǃJ~a) Q/M R׿do[PެAr Sr|8}͔4S,EVҢ1fƓU6Q͒]ZozݭIRwݬxϊvI'*,rm _T 8)2[C^8xρ$k :j!| T`F_M3ۧcCHhTpvedƭP7Cmf8u!l[VP0#g/'$DWwn >Gj v8e e@$3bnфi|(VIf1UZT 
x]1G%s{Tp=3=;%æ`T0.U8m?ΈGX^*ȣ`7SCد\,&F4t:o1-0dŻA"Mg(~I.+RTE9${߲ۄT%gR 2:r*xa+Jꑄ]5ť"KUfN.5&Ͷ~ge)JKV¡ y.Fe_v7㘹l.kgkz4]V526e%x }|25RSZʔptqy_Iٵxx]l/nm5fƒ }9$%G?8bV`D'M*xH0˽ՕJv޺tua&dN.U_)X} \75UBvBTwbTiR!FPpU]}S@#Ja& Ơ ]<-3a(|vI Lr̖Fh1+BAVQg(Ʋʌ*/Gݩ iqorS qaPATH>7g4C.4K46ZmA&Aօ5Ixng Oԍ1S+WqsU#8PVZ}DlRkvdww4H$_ y ѕ#d;Ӻ2RwnyNJorS %2nbZ+0XZB \aHI: %B>'^-,{j7RgֲΗ )ɭJR뺻$"H9_(Y188@EQ^HƦ{0Jleػ`%X;@զ £P"y?՟g~U-=4hL|epR3v,MqwV0/ѨL@\JK=0-hCBLap~Na{`nFh ˹7" G0ӛζxĵ`A\[ )RZ]%+Ql3a6;tLmUռ K+ $ ڟƊX!GDJH-/Nb < #{$ڻg>EZy;ix%|3y9/8)t[b&$bclJ!9.DyI\TXZ^ҒMmزAR 6m3_X)=QԱFZ5Ŕ)7oS8T!kxWL`0㜓390v5OS+!D^>F}Zt9ӒgZĚIT D"%A`=3;QB{n$@õF-lB`~{ȠqZ8ǻli +<'?prJ{IvBOo~f aX9HU;A5U4 ~:dILͰ㘞urvd`,3wZ C8pqe\7a7[5c0np (ᕽuwsl1&Qk֨UMݨZ4o\@Y7\.[t֋1WGvU>Q<^jJQGae!^yY -=rTEao^>_M]"j%⹌]Gҏ} N0=X~ꢮӣw߮L}Ƶ,+ QlX:u^hU`:g ŏ[XHRv˻IQrߧsRdl*ч{>g?R¢)$沔uPuaXSCsե*S5–J= (Hqyv.G]S̻դ.igG%Be"y+c@)=*Hqp8Pݿy8 (% aVY,:TKR&PsZfHELv9Q$JzvVzWDR^\x7:(՜B ɱ| nl |gcɨf /۾!Rr$ۀXt&G^ $geQ);qFlheI $yv ׺?>BkS/:i%$=+ר Y nڱk޻Mi Xv=ZvG)=D(Py!]k-0y`lTDOLW/q*2[{>T :P:q^BGѝ? ^B}^:V2]7u3X`&۟~\.:M_d&r0 0zcM0f+qP^JԦLX!,5y` Mr].2\}x@2I;䛏h{]}~}}1'B?aP,-{-<ֵ_1$+_nG{fV+V~(mG_10.Ȣl0^ 3"2#~L)0Z[xub8[TUPx*4;~ Hn`A vScNdQ!NAs\>x=$^jT"_>pץP|>G?M;q(N?DWnkK"_%-fKJe?ek|J"X|j:yngƋ+ iYcR)X{wOymh^*LSRFbBŶ{)bIdhg3i Ĕ찪Ӆe%#Žm1ӦᯢכҀy╶6Je]/#nx16j+o`\o{[y-؅O_I(Do]r:oՃh_{bs8~{sDJ֮MฎkT|+%E \Z>Mƪ|K.ozƢה0.+N׼R˜Bk9M_q fLDAsO:(m? 
T{pCEEӕ؈69.(m֚5ƻ"EI , (!(ՄVZ=DxEFIR ^ 4YyS`䜵(J,y,'6pYh0'֟xw !yOWuaOn#%sj3>mPQ๓(d ,j4jhlE 彐 MU੪ = =iCT#$C%I؝ Mi "G{+ ٦_9~inÑGN2'=J VJ\:Ms8O}y**Y)pyv;AIS7\O JaK54x:˅;4tRP$ g&I]: m3ʅ'g7ۃ (0 dq nUx-iH(WBjPJA8fZǥNt^f.JQI0d>ËRv,)h`kOOR" Fr6_s<}2J9}U$|h\k& Ȋ ]/"JJ8yB9'R19yXI90N`ЗөUU@Ix%g6IUU~^"Y=Xvkqx_|D:m-QkxVbMܣiFAU ?IzBݗ YסW[~aq>Yp^5"SQދWH !s`Sn^@uDYj1JNϥ9Ь+:%4[Q-%-ԝKyj{.54L*iNd25441vO5LD㹉UZjwTunۉ|ST'i2aK=+\[ڒbv |^n͊\YSBIM!-^$Xx 3ëLj"XJ۲ņ^ovtYQB~VR#5TO Ƙ9/ϺF{C(Kn3iKQ>w9P*'iQ(YCcҚv>s{OW:94 .)EI꣼E(Ʀ(3[L,F!M.>F/rVtYpV\GR͵ccb=M%קU9Mj-S"k\J^}髙׭<V$w6y tRk HݓyHT>;Uh;w%޽3wu/t"B",h'4rÆv/|̡"rk~6I m@fcOgUo^<D֓ƕE5!54Xu۪>u_M~wR0] WiG|tMeOStmp>?IG54ST_/{&FA^"`VTfCz$^[[v􊂻޻2<nt<;%am;tӠ(XC݅yjI1x4Ğ\FvZ|d9zO۝<AXq+DCkO,h6z+ KoUG2~tS}Vg/!WS9JEוއq])IuW0jPܨ*~=) `RMRNylC{zY+n@nGjԠ{cg7rwAQswu?.S|u;ant믎uqEpTo-zC{!mOﳙ5$5γ<8R:tiQ r.:L '&# rGt.@*`ds '_l~Y:̥[ wh,laKw0ty5T"P xٿb_\|+h.%\ݾGxu 0}atz9=T?Z>WuWm>kӖpݞ. 6'G|2'{MKث#gC/F 1xaśW08; .}ߗZ(=!*gpIH\o7vu٢Tk.k$*Sº!qmD JwoaKPSMPsK+c~ ]UIN*{¬2K~pH Ҽpû2נ aPҵr:t!'e_цC@4Zl۳\۹?/`[$ Oea6q-rWSnk[؞Y߹wG]ZIa_ F)6k#є I:$ldJgN7Xh;ϗWSm6g {4v m ~l1 +ؒaԼ|As'z= 5Us5x"^7u΄]1x,Q/ &Hc/Ǐd=XV&IMB#m~t?N:xD8BNዓ$/edx̵Ҿ-f=^_&5RFiRXnNi-4- L=!e5H,/EoHvQl CPD|@C;j^ J/rkdأHg,;tQ<ѥY8")H[> K㈤&pyEDd 8b@ZĭH\z0p М -1)VAؠ\R,9HG׌L9# `Llp($XG4I;ux(t(w }~.TZB#d12[(P]q0\hL ދ.XP4C'߄ bȑ؎Zuh.^={VyCKp4 hyN/B{ƥ3u2(sv%u]Ia2a"T;w rBk^E =#x9EJk,(F9-w#gN;9-5 γTc,I-VYc;mgsh-v^t%҆epj\:|$]1繷̵5yKTYؾ`.j V15BD]N0M4plFp|iaܒ=Vc_)PѠzsFCzN^yPp8Ye 8}c.~w4Xhm͎rjx3*9y6K{@K4l݆K 1pr O'I38cbb=zOj &pA9GhjC)dj7'9b,\# ǹ%d6ipNEloz~ ~[ni\?-:'c?P>DsΎϢϾ h0gNm[%rMcn[eo-|؊{-^gyNx9A_’I~,S"/:)ήF %6"ҦCX.>J+&a Ϯ5D}QG%|w +y-9+,&( YiD1 M;M\J~fDfcg7>^')(Yr X_*7h9nL$- l> rPW[a[OPU_#z>'Ô.Y_Uy)L0rT.qtXci=/6*.h#G\Gh^zN lsfO,y3Z[bǛȴ7iqރ. 
{6$ /w{vn!$Y@,(dW͇4&5RS3CK8kޥy!0cdIA.u9s\h+F[PߨMcU梪bx[p^nFӕav1N9 N(!dsbkY!yOWSϜ0F$;L>yjEpbWI٢6HxNw6p:oT"9(H@!,ͣSbgnͻsË'qɡ7BvnDCwoħ-JJT]g6/өYtprj12]9b[4+rf}kL;[Lݜ_[8JZ3Anl:(+Y%e yOR"EGrrjA_sȖ;~2 J59WUԙ9e0xX<ဴ qX*&'(;:8N`ӉUWQ(OBmF }tZ aYv)C(~1;CG+$ҹU1RG7aʄѴz?sҟsnƧ>%JѣКQCte\y爳T!WFP| i!b,pRN)T¹^)>wƨ\tz.Pfsf3i#Սf-mE><ٮܩa2PSRi-~.S@"N80Z<{Yǩ[*d34t:_dݺhcf8JQA;3}Y1YqXlm@\Xۙ09h)xwa텗n嵛.KNlT l`8f4,]$j5W ̭vȐN6(>s2+z FeR7c{.8J=X6Y" Y=DA!s;S&_~yIk`웿wQhEHI~Sa/ˣSdxtZ+n^?<u)> `vX 2m3`:&'|~_,BwU?kQF4Axݗu{^x$7{9_ iU{Rn"ܕ3n%!, ug"JeY^{[XaȔӀ= ;'*5Xi2wx[4oagl_m)1Uk`ɪdWQl;.qnȇ+50ޞ|Wd tyh ѐ@MVi2?%*)$7>M4^ɖcZ#mw,mϰ3\~m;|i}C1\yߐR &{L7*ж"}D*ޞ+eH,ʎ _u 8R-Pps<"7,IW,?x䰜TS1"WyD `ňHNRؘQ^ *AEX"{Q)rQ 8dSp+uV(S +K glΙ8_}vE~m;118[6뎊'q9h&F/;j}i \EF]&FM(* #YKL_`\ގ%X(C`%D@f.GbNr8-V &Y_ 0<߮!R Id^L]"TQBTDK HrT(,*i1c4dter<~ kA`R7BѭgS7LPkCL/b 䊯.DS1MȦL"cnid$Fjl(R`{Xac0 Q986JI݄w 2ﮗM+mf'Ut"$1Vb)AH)m^¸cEJd&Jʃt3Txü ' 6`{ f=_Άrl $r0.oW\>6pqQz8).Jg%;ś(n G/ȆIGb6 1 T!6X3"U`y֜ XhPqb3Hrp 7WyWhi@EN$ب"ŅO,O9 תлj*sC)A!3ÒGy0a",QOc2h p*zJwy~ t!iaDb1l#0 EF1;F^YNH.bg3 !4U|}wy'url;[/;]=m$ҕ}M:J m)AGin:wSN*o^2[/_G,ohӚI blA B:X$G50V$3*\%@q2tT-+ p(Fԡ~y'@+sK͖"{1+s-oa#818=ceHa Z!qw-S; />W8ӪzZM4 !ǔЗ,HNlBF[}-rRlѠ,ƛ6P )XŐ#erhbWw?hۗj@J4 p2%;50VT'aUJV[q]@Zi7 `a&[;H7W?,}97-@恝>$ђ>և} ٱb>|1y>{4ԼsRC½"T;DeFCp~<[[p;JyuTa$bw*]G|VΛE1RwUɧ$.ɝ$d%I+ gy.9Q.%|3tJhEgA =\I/1>a[XR%j`]\'>U| sHPN,27v?.g1Ux0\,>k&ᆱWi'9$q4Γjr?9D2p "4* , V`ǃg19mtH1Y֐'9gͯ[xq fUa)3;8 8j0,4ږ._hjzVJ_//42SےHse/#G30ʺvPj50^ - js]M5⏦!sQ _bU9O ?]7.y;~^>{k2A0ۑo ;TgQhYd2~3 $M1<4h'Mrї3OUҌ|:Kf p{+B(XP! ]>oS.ިi1M9 ޚJ^fN5Ȩ yKjv`7znJHZveK>%-ݛuygFpz@j׫XрT03%%^~@q%4o;Qnn8LMp̀v@? 
_m %NYlϮ_}ٌ{8gX5Luڠ7^ 0c 6C ;cyQ.({1 5)}#2C`n\ :Un:*!6h1hh@2Nȹ?r|*BÓvf s(c2 Bbyvld1.%Abcs4{q,(=_ u<W_MNc>FDl Ah NMtyGiuV=l9j*z"z:5X$GoS#DK Gn)=ws]Y3<˚/f9鷹4\Lǹ~ǃ_VXZm,'qpy=_ |#)Qc9;kr_u3?8(7Ft=V[T2!wkJ=z ;?KF{fip#Y#m@ݜsb`#Z%9NO5EI9Jk[cL63Uuî@nWXw5SU*"p̬1"pjqe, O~ r\FAL/+{oڐր|*00gPZtv ֜5'@ g&[5`QX*J.FykrŬs!GmXLƐxLZn\cՅȺhf!!$cBeƋsǪ  $cNʂKsx%}>j`#@Y nEb t!1 JV\ Lΰb+`q世9UQE$45m u؀[V,'mNr c u2toiŕmCID Ęx~/.2 m15CՂږQx1(W2͘)Bm]璳o9k,A;g'-Ƞ'1hfz(ؓ?1+7mjt)l tt"-jŽO!9etE祈aAKrЂو\!Z؈E\ ?.bL&60CdHZz#Lguz&BwwOQ 2ݾ M9{(}k2x]lyfuy1qMgP0QPω 1X`!cE/:XA˕tIWoBwQݣEs,`}!A`A /h[B#zA96 z~_KlBMixїjgcQtPVaVD~)hʰYgVŬ#y$]ɑ' 䄓&Qb N =昶B 6t 'RtDUWC%RQCUޅ̮161c*jSvHNw&#[!rD12DcbCNp}vJHIlR%A~jLBWcX ˌ!?ٴ*b#Pee+9 Nd+{yR)8"()ff{8`pF|ՇT>!UZC:S;wN" b /ŻX.b`$Ec a^we}kE ׋VI0P/EyWo>NB'%T,ܷ7 4Y>/*cOqV./P8L89G|{gc.>X6ث5.._X-z4܇D$}T0b"g 9Mڑ{1g|krz&r/d j jqߟ=8b4kkҘ/;Jƅ`hŤE}4ܾfжP-F10g~ٻSybSv[ݧ#E>87M)a};hçwŜz 198OPt|3ǿLnmM.;xIՓ s4PW|@ QrVMg|5ԳZzEGd!-fS,[A !Ӄ&z*E^PÔQ]`x}[K=f79Bw_EtKk \G ΐ 󼅍#@}B}GޛGkPvf3$o ɿ6H6[R'A"V~#ft7ޠwk V/2{rc?7kk85j(5[O돺5Mb;e׾z&FϵXs-P|5"{9(}5f#Hu-=wE;q ~8 Rr,Obs%+`|C/(|B.89Oaڑ'h A;B]o{kBQ^^7i$ΐ| ?G lxNrBF `BE mhY?/;0>??{xZW} s?}n?[:zϨz~6dYCvZ%+bC0G%k}1qc4@ʽ1.db&j{l9RɊQTC̦6o|ڐ|ni{3beQ-_S'(BZl0<%쬟ase; sjj;ya4Rsp2zN#u'oBC[}zgR\vujI ظu>91tlξVæ O&Cb9CbC20"fxMQÅ(<`]}ܵwvehaҷimxtc[m}9fv<=xgՠjE͓UvPԺ1j>ZL`sw)e͍M1 ZCm=Hvd4Ηڛ>Њ < #͸sb.!;JVhV(!`k1rMY]?cӹZZNDkv51kds'x0Wήum z-7 ʳme01ͮٵ1 zYtǟn R(Q.@trCt|VrlQ1:TEDP`6c(7- R5.3*6Z'bs7 m#`ngcbn`s*r72F,TY\9-MCAk!-bH!;OԔ+dӜ|HbdkS.GsLsb 7kwݳ}MatNec0u~ut)AC|3cm7N@\ |?+25ɞG ]vP"Z]BaC޷MP&u:#x 1ҨB^4:Q!Zb`YPh=zfSX0}7h-zc-?=TZW)*l|٭լV١XCYa#7s(BCˎ]YWP=TsM9?v{ެ&ܫ>^ OWG&΄rP?|i]M}\~ Z㠸Yx( pGn_dw\~O2Gp?j#LٰyC/"='!P>ld<*hP!]KsIn+ CIGwb/>y}ȧVkDْfjr)$ ٕ,V!3@=(1#ĭ7TcΡKikW(!FfP2 vK3Й!_{e?`<*Ͱ7RQgZM8Q !+{ `t NE;9w2lƾ)pkgd`zQ~Uo?㱰J3T0q #xy8e"F3>K=>1OƇ ,l0]8zE]JrŻc4K7~(#b,}z??QSj;tT ,nnMmSiW'jK֔?;_~T߽1xIj;Vf>{q|%NI&,Ԟ wP3B}+>I69HO(p&|6S梪+$3ɄζF=*lna;en 8pjxN1F=HC[=,{2%B5NLTǔP/TU!xH%3{0-OzbKJ Pe V2Y;P$S:Pe(ǭzkʣd.5摤t }vqعT QKZSiTYcKWHwSƁm"Q!+jp.M"F$JMXU{,,:r5a\96==%0MeCns4g 
8.<GBp%ə{K$T`9\ !J fO93z>_%R֖ @ѻ^?W]s MNLY% CpڏKVSCWr33^A5[^y70t/q^_ mI//v.n[rŠZ׽ DIx+wiRr86zT-4#TnYx,8ZHY4/fwp3yG{',q)/hTl;bTM_5/h ] @'o;k1]Z\y2g-"lxDSi rt+IGs%yѐ=ۘ#*bLp OWhwp6*oG14f:b+[M`DWhC;FxN.GsuX4F!kS}ԯ@;ێ 7(TѯnvM0O"~NN썍<۽VhBU+{,ΦCqF~ 7՜*.DڭnFQ擹mm'sz¨Qjtu!~4+ٻޭt's;ȣZ0ٝRNէEZUaOwƅY,cZpbbж9vۅt9KZ3c ])+~0m+þ=KM M5FʰW}S(XV$Gw #nݱ]:N:5Dx9(W3EO./e/z9m.|5ټajr;%xrmUك;' $J''6RiR2|x7A*puxdU lS"jzn%2{(>O6)hj U',&{Luj ޷C1QUZ3# =4\J)@=T4"|QE瞬dkګOuU( Z)f:շ= fWJZY$Y6N~Hx8"k썃4"B1e^W5ޣ8T:{)kԤ;u͙P0DuWVqv^Nf%;!8y[P ނY&(%7= 2K1 ]M hus͋dL^θ`LqKX^m̈x/N4Dye,$'9d!ՆÊKCц63(V iEp樋dtz'tS&R,M'lyf6 ʣi"CD\mdjO< ;2BjX-PbnpN~z۾|Uٔ)Mr6o̓mx9yGHTJVkErCDoKdMa͢eWMnsC͎ ;-(#(Ki.C+#lw5MѦ8j]m;L=5N;Z nCGy-7YWպAUL/}ВGY+//m]ܯ8&CBՀlvcH׋1S(`[Ϥ,:'Հ-IS;(T]q㳃;O?.|cKuP8QGC21{4b]SN}FOIFrOSzNMXP蜑M5A.bԕYZԆz굹PS S "@Gf@ Y <%}-o3vv0"CzV.%O{۩|ܑ`vTG#c͏^Y~* z] mE쯳ŵT:#bfI,_,Mg34,SBk#IB[a5 e7FL ) &I(mԱ<+Bo]F StbW=9=~hb?(ޠC787bR š1KI3uƐ ;CK"=h&%Ic RfmB)Ofs_b#F,M]Sw~`01:CʇV-g!%TgA)mu+a)5/u;q.V*┱ )CA'Ch ( "ZWDɉ7_yx}P`cjzrFF"B~e2i%ɍŌPf=}]ѳ@A2C7q,\$]B0]$XK`iwzMo󥩌u}0^TYszQՃ ƝxY:}Z(Ck}jNG_3qMxށMӻ77DjK(8M0$nFV?h (N9b~L*b&.4-{\iT `;\ h%zv`d<:[/LNnqV"a5nۣͩwfe4s\2:w׀ɔ֘OKQS)(O}RI8)'Ol PMzB`Vr!%䨶RQb\`D(՜k)d B3*zRz6:7o^y:P^Nƙě 86z%ɛE3c1.הt0qp}h[Y])s/>tm8y}dY F>(Z}G@! ZB IZTS ||6R}{=-~x%ٌuGkMo)0δmmgF3*hjFYb>KIB`oJ^$.|}$\}-_S@<Oq%<=GqvQlgӧ#@&>uciӊSb⇳I"yT 892S%Cʹr˵\riQ ӘC~ }$'=~1D,knu_ wMgj`M{Fi7u"?[kCn|s,NWحZُ*S (#2e q]f>Ȃ8i8#<( ]9ʶ Ғ0KhT!*EĈ0?y {oQM^QO0sl%aSQf'޵q+ٿ2ݶId &.kqY%;'3ҴfzF=3L?TOX+uE؈b>^r_"4%ѴmUҙ- F1j B"TT'G"]ac4WB YX1G=#l',f#mXq:H.)x* 5$5QA& F+Ow]mFxDDnРj8 #w0K=I<FhF#sʎb@T! 
0BFf>0;jܕ;F~ hC@KyDJS4Z j8#媾rr3rD8J#9!ЮG(KI?JKs QZsAvYdiY]y2fe`A .Rf5XmjчB=YTabX@ J=+oiZh1w;;" .܌y2{+YQ U)sRCϴ5!j?N"Mߝܖ8u3;AR4ȝQR4CNxU#gfJV-L;7- Eu7g"TTfZJ 9c50 ڱ}Tvi1cnGvݪ:@ Ym7DWf9vK{@Th~QZۯ zڑt\WWhWק@v_d?WϜ~goEzߦY*2˗gRY***=2Jυt," 'KǷRgFe./k$p"W΋y< :ۼ믒]w@f"lbFZ-r-'Cd~*D s,a]lXG`\ dض *Փ!Ȥ~hlLxkX^:,$Zbdae>kvZ8IlN9s/7ErlEl.>]5MmFh4–CFi'w!9#J'')ȵ-KN R7Eb!d61$̬('Z6R>tzv>G|6h[dyQ4Ĕ1FD>ȶQIsˣ!IaqeW!Vr( *o.gߊҙ=^ayXŽ(c8=]mgniƆD`Gt^&*8FheEҭAd*Mf_:GҞM6GoP &q 69ff[k]n̻خHQ6 xdʤ"ZBgT1g m,Q$yrYFCJң@Y7QV< )Xz9N 0e+e`[J{u}^+5)M"GODfc.K)ۚu]ysER&r @E)vʠ?s!ɡgo1kUv2-6[um80m/7Fǀ%&Xl -y2"Ìv;潑'f1yN3Y(Hz.`̌?gڊ)@`W&~б-(k7"\ KG#[Ҧ>Wb8/m``۳[1ϩbcV m+#vk`5ẓ%ƾgKwqu//K" j Ŷ~y̿} fsǛ?eʟ ]lQ~X=Yع+Z&Cfn ]!ZiVֳutLGrÿZW޳+}bнٿ\Ӻn `0?o[nZRV cd T b\ep/ߦޱ-;D ӷ&J>z0[ŝ8e>螬0xCZOZ46X%F[g ^ 5VLц0%~vQ+g1{  9.l|~Gq$1 _|@R`pyּge0Ĵ@)ڤc[&z%r9à)'$_PY n_|9{Na| ``/B)+^xuC >VjWWnNd"mf,]\;(GYڟ/+ iiTR{ͨ&Dm.ӽmD(Y+̠-9]qV`D P%t" 堒 /K PR;J:d }wT܄3N0n dЀc DvD\&JS: Z-%Gn[wwKm?~wK퀰KvQݙl lcG6ظe;=$+δf)gw De 6?`3gO>,_J1j!lͮ?4֐ X⿊ҹum&i͕ՐN 7+{ЮU9A;3y,Imvmv}K5-,>^t߳+]1+A~li>|p۾O93.NP ހR>aj[Yb 't`skg)ƁKҞb *Oo_̮ Y=i)MxKD׭‡O-[R\~HLl0/S ']-C;3腈f:IC+ay:_F˿-*+?2.WWg\P.ob,_RM\nb|}fuq??k'(BmBb` ؀Z!z5L2H8~w ΅,SK}釻Që1ssk;zvPye, n$[W,ïէx_q Ӥ-)&b,-i C ? +n/.V/gZ7O,˿#i-z,:YWȑ}jC/~wwSgx"%Z&,MirR`<6N̷ whY77j< 6YRbR+Fg)62$P'G 9`'1_5x)kP+}\ߎQ;fQێ[+zNՈRZh~ a%R)sPMLsM}dӟϡ_~yȴ)gyƺtl_Z Pluj^1?kvɑJ/,9V!,u6pNeZ}+Z| JV{4 M+7$ylr0f6Mmj܈ڠļ&|B(b¸fzq»5 Y(U"Ōh@KE-q}(!m31] xfh4/TiSn)Sd4{.z0FHOr jI"%JYjz_.LjB T92 L a-V9:xR9[@+ɂP\n?`[nu}7^tKqngw+2qjd^CW2JC=}{€JG?F84 wotRfݢRVxx"4g}^,sxT+yrk ^, l٭m_J\Db\YtRFY=+|J7e2G{W/#S=~(Y1>G] 'RN%NoxX~r z\r]ʀ0N2&jud'U .@'CKZuCUUBAA骒Vg1JYBR~)Ξ,MwF?.f\d WI}8p}QB|>0)_q^E2:r~M6~GrUf<ܹP,ʎJ݇lmgŲLPU RAK$M""!uY" xFző UGFޅ  kT*YYTp;dl-dj7P+W=w˧j!Bk)j!Z A^^ýUmT}xL0(#U>ݫ`;8RL9%`A&4 !Kx1$.H}h77~ xϟ,Y4t|*w~{/烀W'f7}_zg&+͋N'kd:`s;txOt(Fk)Us=gۙsOdnpM*h,eҷvo\qTJm(? 
g@eU8}翜w;۱ù⁐6y^z=1$ 0ͻ\WC]zPBJ#)ZClIƁNX<5҂3K9 \&e*3(5c-UVՃzsaj^|ZXSi8W:DpR%B=U(גp'yNɗrW@@" 灛6 %m҄'w(Dsq_4!2qRed(T(4.$0+Ve-0FAeOȐАEZ95؆oaXl#d5ZV)aE[~ϑ"@a[ ]c i-%w[tketrL]L+*ZR*dT<;-\9D΅2"t QDA3T>}iSͫڈ_5fZ~-+ g|Y͋ky}l@=g!|#o(a?.Z]VՅkuQk5):zX)4lbYa8VSF,N9Ց S.O`^ӁIGR-nAi>fj䃤W?WV0+BuvaLU+: WhSV #piԒhY2Oizi-U'hT}LH[ȕno]0BXtVC ̠\QP`6WgC٠7mENF)v7$H2PQZ``nzv?Є$o%v^^Z(C5F*EhQw۴~Fm?CۏxgYw<VTSWBh[^@kzaMI:( 3^R^`;mZ5X_z#C-V|!jֽ >| qE27$`Sh! &%h0z1!w_bkJۋ}} ákzR)gE)*1O N 2ӏKģʉ&#㢢h_%,>&4N[WPm4},h5Y-,|47^5:#})v-xruq]r= j\@WIC}c Y!R4#WpUB*67b%ʷA!LNx.B]:=E<%ҏ7(ΓgH"i"AWI7A\{Bm<+Pj |{ D%,G5dVj\ӳs=v7H:k{ZSiyJA t;MJ`pnd[AUt<d4)low44 pQM4(MMn/4r<U!CpS[$ aԃ+_1ma(*v(𾒅Lq\(3eIeV".ra{>ĺ{);~׹Bpr}Hr3Y~Ɣ9c\LK-~SsHfv6}o֐Zn0kÌ0-[0-`kVPia}"@ӫ$We4UH' .r$F !T cõii2pp b|͚l{F?geRH޶ HκFHh~fF Lſ+LA5376X|MU:Ԣ͉}kC*blj$lEElUU޽Uj@C ¢yxz0J R*ZghswBVjUUb )zZ?XЖ1GD1Z:J&D[dx/CPx馥Ս0ݢw<jh,&kBs=Be`$#ьD@^ 8%O4Vjk+(U/нSpB%,׸kctk2SeoӂIf5O?fӖ[M{(hVgϘ.ޫȋ s( J7?0K4WQj?Ҿ e%,#htZVOԿu;bJ*:C=L 97Le4HΥ=0 3 m;!wJX2$CxXq(2N x(Dy<~ zQ @2ijYZEb$MH5EFMGߎh ES~!M.y/p5q(6:V9+! 
nU=SLѣ"3jux09()1 1w ]x8h!=VNg дރ$vp|i.d''/9"JJ'891Yt F :K E.0om*2 ʽq.['ZdO5k>Ņ76rv?535.^iT΋'.w#;ZZгNf4n~ VbɧטW~7JjS0/ Ȍsw;|baT&=XɽE0MY2dm#ŽÂrqXkY*syAh4FĢ`ҘB! 0Y)xVFb7! d.M,X4֩޻ɒύ(%NKS0`GY8>XMiRwK@RAX܀e[vDkO$цOJ@JE #M)FxL-`a:}4Gg/Zå ϒVSZf.DE(}  xhs7뗕 [9X~)жe_QV5Q[|$~h-Ս8Rǫ橕$(QAWoSDZeTNP^"(,p8/#%1ɜq!p~Zg%!KaK1U. S稠 I@NGoN< 8j;WHgBbMTi}{ta<ŕ?-"[0ZiyzY'gs *0=ZKȣ;Ƀ1wi՘{BAS3 kPSFQ#:jߦ9nV2y5p7M Z!QOt79h8?URR:u56a==C'iոۻjގa@,67IMr:yYm9G(8:衸8/U`0iζϜ\y%!Y8JhuEI5@*Ằ6!]Q Q{^jBs7ռF+Ir{Ofuz8ٙowF(3~Y}]C4X|EyNZcLzg._A1/ِZ/na8ƴ2~<(V9XY%VgZB >Fe!O>e$mP]Kd8꾹xn6TjDTT[>WTCq!Ŏ By{C#dvLCxYn+)T}Q֕j/Hմ=V})(tٲχ"J6U3_{[0-j6&N3Q#cB$ʄ2 F/4%0.^/Vy%ܷUs& đ>.L,YS^ظ{+/,gakZݎOt,+|K "~6//56&,a &+,źP e i+OthH,K= DY$I/ae}E#4$͹mdѷߟ27S2!mJIAs*N,3@1HeUaz ccjգi$F3Fj94rZpJ#mH *!㴤(_ci}QV1k䌱K¾\eBRr»7qr<#J15nlFCϤE%BhѓZ:ٔ NO3O" FҗE߾`Nj/8zV<ZH0#M/0!?3g_KhKJsݛE:ҡvxلR掶iHں21?egn̓3upq;oٺ]4Q3 ?HDPm#ZʳhR*y_0f >D JQ)Q#͛;%WL`;w;@tw8) Ƣ(SUψO_(IVm;bF8 %Ĵtöl(19ń{ۀ./.9Q.37E>ӶGUX$%0&_EӃ1zӣ-SJrygnnXmLl%J`J#Dlh{J 68PscBX&L(e*Gf6OxN%Ҭ^!Sh/VhLZ0W\ cp5Wu'Rz3R#,FՔ!n=yn˯x~:+o//9Fpgewd!%6apqӝ s7IF% 15H i{7jb~26 B@lep*p0Jp̼=)-Mxi7<]5=\""sYR.KZjKz. 
`b"`r#5]?_ Y9A$TOo7]ӝ/d4b&Qwپy4i#fOɁPC0sNSN&ywtڝߣG=NznJMp>vtt"$G%pZ= e]~ /xUNy"Te%΅OJOe=>"J-n2(Z3uʴϱAkV|K?MpRiUp=b::$TaFMƭhs#} D:_IПf.lri/Jt>Z6U]sծt!l&H˥JuE/9FyjYmx=x%+߆ .9Fr35q4$00ud ndBWc1A߽QJvb\uQ p &)/`;%5j %//*_7o^ƌt`6x9מQ1x V#a4QJ%o]bt4JPt'}:@DE<;ZS07O05mjѬ9C0#sfqVu'۟6,5ngSC!(UՏ iWe{\Ur֡ZE6b%¤;!N!~Ň/i~\F_>ݹ0wTDIx7Y˧;8DqØuX+5QV!K{ 5HZ^ =!Yq‡UuW R0>C^4qSݻ[ /7Nm/-!$E'6a 多réaǒB]v6\[>CdLx m΅z)heE?JiŴ~ 䤕2;jj-&t) ?r\*SIFChϵ5WݻCb3TPOj}Ohtwʽ V}879`fUӫBNk3ɛfV@YJ#xu^k;E!0lt5}H'ge*3Q1|7T%?NP'ehx+ -(.2u"Vv4 [G"˚Hܺ攬lȂN1tbI"KmD,p%.9Rg|Fl5nY{N7qW^:0kjA SΏr s, gE ?Pɤe ¤*q̠Z꤈T3V1Z(5z%sJ2$#bF㸝bznqap9o wƇOk^19/llưٻ7n$Z19`L8ŀdDL]E=fZRKb%NlZ⯊U*jxϮ,#p(&ѷSPoflWxk9kigUg"Kv%3s>|e,]C'Z2%O_'oh7^,vDVѩ*vE:n'ݚ?.Sq7'LT-O?|UM$OZ>7d\WI}eAw$ܐS =YjXWչ>lay8m+f5VJbkYFs9ǕA$`띱[17(Ԝ8sxNwzOU ׎h[N"v3\\TFضD[$;$DZE'\;*x/O^I 4vFϙ!ΖCfs*a:LYsER)#%+dϬP&cʐ̢3]ڣX϶c?CKԗjoUj 2IBcHR0“CyEWQb!HJ9:o9trY!") ~jADdfEk~mx2]0㇧Y_>|ٯBW_ dR@wxH`m $DoS > |< %hH+1(@"y)ҬSp!i9q{\PgWB4Ib;Z,2-/ 8W7/T]~@ygEHbZL6suD]\Lc2&ʌb*È G\uQ;q+<fjk) [`0\rA ĊaWEX{75[t@k^ٮܺl'f[ p&("N*'Ex+ή4uBrA=YX{).UKroykQF<4j(غk4|]â 9;I%8f-y:rRIBkXB3I A ";SyiZzX*+Ep@b5;M\ K۞-Z%= : B%/ '/ΙYNFC}1+pTVwjXfGN~ IbAYLd#ksb/cc_xa48̆<ߨkQ5T4[ l@GrK,C rFmnQ&9t 4-U cM%XPUƪJ\b*5 x/1%Zx! 
R䌦!2':@.7\+!4ZbԖBDM zV[ ,EjFlK2TWlޛ_{ҖLx1aD;j'@Q)x:5 Mi(&|cu/ȺWΑ-g,S6B9s(\6:Ӭ㽟< Ocp4OMb{"#ۃ0/ fMz_?=1>4Nj/T g!|dӇI6 k8r#XN{|^?Л Bomh<ջeUcu؄D Nin-N?fDCؾ{@~Kfl.4igѾ4x&3x':mD22ob2 hz "t+J2R/)L?|%[jaDv˞ǖ2=Cg.b$TST?YpL""H+EG.јU0TSE^m%^ֺ[DZŪ' ڵpLu@ w5sL5mjnWLRڂ_U#.yXr%ZvqT !TRujt9XcneZH+@@Fy( pH깤{2hXF?oiʃwk꾨V/?\5 CeyDF~JQ*B5(FmS k$7~qT-҆il b%s$A䭞hD]$EH'"d'WEXIbYM#x^iTuB 8qJZ}IڃAӓ[ex)«oG&TpKe$~Y%m^/UeKTCT @W(/> (:̦neYɣsdUd$?5FXv\>.bk̖KA˘I t;c9 UQYdáEZȫT/iUp NgVjlUMHQVaa7Y.ڏ uɮծgWgrh&'RnΎK77ݴp¥ +]w?Vh^'^[Ue5yY52S: Bnt逖Dr0Twf؛FsW\H|,uU5i::9b31IUW[TR^ ^M!R>@ތ;1hL8& <֒K*V(F'ލAHDV:232B!3ffZN}<яэ$!׈׹%>q̙֘̂f!tR`vD/{vOˈȟůE2+ΙYNFSy JyJrSmeɣ3䄪b~zTUzDJ(V~b 4BVW[EE5ߺӝekrsqe zg,Vy̍7ʫ I͉ÎJL1W4pW#`gWgrN /`%cOXhro&f{&4Wj٣'ĖsuS-*FiU@b5{t;4á+` 1npDqp_ G)9]c=帹"MYu2XwWfy.'2/cyu$,( )rexVqإmZ` u2}?ę-8{FaiH5h^F=fW$70O?{.-ɚٳ`/vk]SQA֚Z\U[zb+A :.ブᒳ{o 7EޫOY/w7⌴)K5-x`Xp2`Bɑ 21)g_ ׾ɟ&ۘf܂ >"ފ,md4Ku.8x9Qxc&W{vFiz(bP+mT3X{fCu9lx];f"׃{Mcj42v裨JK <ȸOtuCBz^ܵKYk'MB]5tvHI;TׁN uNm~J8tr4x81/#1`d\nXTqvYDyιWyf+P9fi$H |r.H0HHsyȩc5u#N?bxzocxx^4w+<؊dC}g_=fۅIO< , [ T 4ǒai]/8 *0ߟ5a$Mxhƭ@x)1p_ ќx*J6jY"l> t9\\/>ot%(igÉaFqT5P+Y)ΧNT%l$g|( sLOMb͌134ΘSDrrl;ruP94䔦+11yy@z@+1g)x1y\9KyG)/ţzӔ:ފ28Pq,a%½k PⰞeށS \džb%xNV'rJ,F ~<]:>UNl.b(IK$h5 B|k7bw,635 y1L:tQ)w@jx)kc;KmYL\Cӡk5N :aSRBtV.ҚҊ'U MM%zuT~-mVN<^10`yBL^kKie<889jPmDKF5m!|֢'JN}OK=_&\բN Զlp)ۊZRZ6Qep>#h.|X?EUYkH1jbwmMnX25[6W!{vo2Tm%ꎤoIMH%"$>b|auSتPC(LoV8Ô{wM|Q9`qs&# kձo"yOi!GXPrL΀қG iYW brL,K1|y/D,3e8Ӈ/?_2 =V|s-)Hƫc U}pIiѴ}bwZՅ6Lg J" (pzrG)՜*B$~xt}VV4sJ%&z<;L#: B"g kjA)bϒ ,  G0&rFTs,=j H.B[z3!/U=܏x/r "Z+bt2Hb6q6C\(JsÕ}ίX>uniL=<$9x綀iUPvcfjӹ ? 
0[ô[tqw_4YMUE .CP2w'RF*l4D#(#$ 8VJ4rv0+ߠoWܪkc|X0XFB/L߿,׽G?O'SX3s6iZA>c`L4!J ~$^@J)yEd=OriiPrF"K C)%ʴ{~׮,t6-n(%IL6skzS]>* BZH33-՞!1,P+^x/ %lY -1kԴ0(o=Sw"n.n߼~-V ʐ79W*-Zǹ6Q:c=6yJK+=SZjh`i@mv׺<1]밵Te)+\*p)QfW58;#Nt-V`.<8Xvk $`^Q+$IҒZP!9-!a* O"T4LMJD(ljwt+*JƻM𴸀I%1EJꬷzcK%g}Z$K]Eko/ɘ1Ka+'=%ʕ+*R !*zf+l -I$  ,uL0˄uR [gćimq 2p& GeiSrq`w(TL.Ƃ?>fv9#PP ~`XG#LoyT7Ř7JPDE*3{y|~YpW5Stx;;^L6ʱOT^d=UR>""5%Si SUN2FA}ȽC YZ*_Ӊpˢ0,B`M JD5JgɩP^ɀ&,` KUdIQW%gR,3IbB^5ٞJGp ٥<XةL%^9}Epurx{"&DYoOdO#TnF3k 7aOJakR)|pSnUͩ) {6եwǏyUb ҃{.}tQ7D\* b~^;T vqE(|c̐x 4egv nQJ+#ІBP5dcMAӃ,FcM2"VK,BHTQU@oSjl+(G[-K|8G C_On!% Y:Rt}7}L^L@wו U&󻏫<8qy Gwy7~\ͷw,j]heΛnjFZG08J/ı$ɣKB~elZXrI]V$cٹ BMZ)g[@3|c̳\֑и*^bq$(J1*GkRI]t_YUaV<{ߪ2(_ΰyA30JjuRB f jkx_lc5𛩼rnݾFlAݨ ٭ Y>T=xv] `QYQ-غ{+exI %̭ދ t2즷zЩVEhkNjV!v z(b 3JvfBCgNd /F"/1Efv#ْJb e &L}FVͽIXwoh("r4#zd61`P1G+`V)! JK#-bKuCYvqk ѥ t#!X] )ҁ' qeH?m^sfRåx>):5GGXͧ&^O]=-I$vEx4RYrr{u띗#c:QD4}wc7[Ѝ|7JfS'X>-[Dh}מJZ`~ȃĪE}Vl6`ΜU P?x(b=pV/~8rږsg4.h[6ϙ{5ipgX`ٜsX`7{N8đt#h8_C᮹yyoU?:>GGϮ FAك6CTeg>v[=08^r6#g"F~E[ZzlLDD3}uBp0{?hz9Lm5 1/P.ŽAsKQ%½ɗ|䈜MN&gpܓМ@g񇹹P<6WB+} ',+w<î\_48浍;qEx̷po GnGD/\Vc.^+g~V|\.!஽w\^x\Η+E(|cDzɿDWl L %kȡ{X˙s ;Cv)7>VW;'(p+A2E5dH]9^+I‘:fdg{jMoH*A`WvU%K^?{Wƍ _vnF}Jlo*f+8LJMԒxk6HCb8I˒əA@_n mC<桛Ubgn{i/-eR,_h"Q.EUp4`%nh2Mn:NL>䤵#QL,O@d:[d7UIꃥ]S񱻨9_0M{+_}:K[z ۍ?yKMA;Ocx#PLga䋰߽?QJ%fly*QP] @K%,(}&y- UVDFXHշT3]ӂݸtAWˠyYQEdWZ!H39ߟlXjٕ́HpDֲ5Lեڔ+mB& 4Zj>zTý7ƴd$Ja0Y5gL:SzJOf3bhխsli!)>D᎓ZRV̜ɓK}+QgGNH`*Zm=jO9h朠s'2K  #(Xv^׾A @ᜮo  R1PvfZL" ݒ(amf!d.>1Cl>QR\h{TGbmoL!0pj|Ն֓:(y&z HDg% gKe bJ N tՊcQH;O b!-MD(ĜH{JO?\P1/aQ `0c_-)|BI ~r̉SO cۦ9Mr hLm&q#vvҤe;9yVRΎ6UYTwÄJ\1gM< Ǔ5 t`81Gq4P:yA{ l^NzwqHM,Opc'gJpVy'`'/>h + 49ƿwM[(V.KяG_1 gg+RRtWg5ƴQ^nag ooSJesFeuE DO _Yť&OF 7nǣ~f@s,͠Mp7vdr~Zx> LC?t;n<.6@`(6 [3n2u6-&gZқ3iOm* <%ÕlFU(3^! V)v2-!23FX8R_\  \v\(a bvnh'`^߾nU"*׮ϱx`4tpnєya<є"q+A5hD ʕb4LĎFNȗ? 
$r.idO$oJT#yR)>[m }Чߘ47B1?^3Aat>izq ]:y~筻Dzl}=57ܠ &dRM8JfRK(x{d^QD ҈Lv~ q%/X1H`͌'SuN0&X<;PY0a4m ?ʃ=ON{W3QB9?ou/2x`hDZnVu73 SH9iShX|HƍL#8T+alПe U@Tf̟ϻ~~Tưr+*HLkLh$ ?5' (֩vcJ)t}mVaf[UY6qn4Q9!,`Z` .6}OS`0ojRުؗъ99-r]8f)g \:FFTTD2CNIŹdx̟텦xGAQ~?f>՝n*a`3(>~WdFD`,O޿{`2{۱}c,~ss "D|!ӏoǓ#> `_uCױdOLh3z釻n-LY0Ap wPͺs.p\(E1ihҭP@)p?~kM>)$X ASG@%.ȕBľ"#Ki(N`3>0 O=r7,,2.zm+_qj|`Rmej@$whޛ~5#;1Iԝ>cuܔkN%P%oҹEQ%Z4Q#Ȋ@^)YJuĚv@]\@@ Gk2U|h%kS6 M 4Y[f*~P9` u{Ly qin;~<Mi*7x4&HJʥi4gAvQ$(TnT \ &Y(?~*M1CEC|Ն*,9@t9*.KJQL.w}<2K2aZD; &Qx;12xv\Np+Zp$IKm<ڏG` Kpӗ3oMpK +W%f!~ʱl3d43TSJdj\,XQni)d="[ ;e4N1+5"qd!!rJcb:,Vn7TQOx HsiM4"?ILZ+*151UZWiE]RUr,iL(7v?FljK2ۥ577o@0t ߵG3Z4٪`C "EDGjn6; ޵3 SowP9b=}Te?b\\T4Z**!jB|o.B|~u LF-%9ǴrL!״κm?VlZ*msN0QqnRտvi[mJJ:..iσ--qTt8Lgidu \4B  145MbIn{oeZ7(}Tg6=L4lSUCRb˲쥐 ]p~!Nde׳Uz9j.xON&+G"EZ2 cʥ^VDߍp_ki^oZyu@>XʵÿXv/?33 WHҰ@i(_{usYJDK"%:WSm.@Ҝ4b! O˧ͣi ٦ RKn'1RN4QaprMXB-UcF #E06ickԇh2.qxK 8Y bfPjO6f'h{2Q+urQǑ(nl <9NhFڔR{#UqiEnn?׽^s? p$6ZT݆$G(6!?Ê[g{1܄aZ7Yvo^gfJ31~8Ǝ6ܻlҔRq${2lypdEն>Q [D<|yuAA3QSYJZ^]*2g햚?COy ^*0:wcO&w/فg ݨ˷M.;[WfκhO{j{VqHpE01\/afT!, -zCp< !23FX8R_\ c򁗸;pal#>Bӄu˯O|qa3]y+!Nn"S壵Ơ79uϺM6M3Oc їpDE*\2 hU2Q"Rr^ɫ.߼s23W(#T,2d*/Xػ޶eU{oa~ps(NIڢhcuȒ*iҠ̒LKI=!QKr7;! !>!PѷWEB:i}BSJ5 4airZh&Ix8E@*1s82=0OD2r|3*|/H1L·ЕǏ^H-ʂ.O?ʹGGë .dAM#`Bb @ם};]n>G*XT3 d W"oACAH"OyX{T>?̗f=~bay t6Yi#P?`grbg)K!K I]M=*ّǤ+nj2ksja"L,Tdߝxm|3_ |1gF:~??~2~OejP2Ua@<ώ_$;.|\1 4ŋ'z훷:.x}w0^q۞?w}oӛ|>39 OݟWqQ,wٛCX?ecU8BׯEoyUh||99q8[<;u;6pn=Ɏ\E \_`eOnݎ7;S>9_I?y \K=ڍI.Dp|fyml£m tB[*ԣK/z/_~{ƗALʒ}:(W ^ A.}94R]BUI:fקfd^7 ֋Ϡg^>{y5}r/z/GI3 y- Z ? 
ٷCA _@ I?k5ȋG}?^ڂ;ᅡWɁ A;ߛd4z2\ 0̛~/AM},TŒS,xS;Vr$tp:n37kFkxy:FOUh튎L=LoLi+Q aoDm)iח&C ch r̥Aco4oSg=*3G>&CFe.ݻ뫁 qFr MSXU12lb4C!f}Yj#S ֝L<uHg WT5QҮLiG7(G~s* _?}syo +hnr%~Ү ba6 ~:2z"L}JK7w{4 rb~Fk6Xf_j kʔ{Aϧ_Nȇ}{[vﱯ|^Ǿe?۶uZ%9l|%?ӊ,4i8vθOH\2Jge?0aҖwdVmSwVF˱;+cUy`Dve4Ji=muv٭?yV{Zj[25tN 0Kz>.l>qla fhIZrzz:PKSL" #H+̌Ӗ!Tb0JOsB(|F(ːD1Pa) aPPĠ{N}JXgJ'7r%%cF 11Ihn`Uƺe$}z(&TWC1F¹KX I>9NCSxO3>ˢ*-cK1< )zɮ0V c_):)Y)dSKw`ːSb23@q%8p#X`R$@9t9Xǭ|+7L'E"[:XիnUcF鯍-uF5gû@A|S4ju*C wcT$&Kr,r960njXnqo[~IO;]+68̻R&LzYV򮇆y%G~/ľ/_yNdjʤFI(qb%VMX9oq.ceZf`aXMp! yx:eq^{d~Tw5oP^}Xاjo? 0Ն4Ӏyej RT2AI!T Ne)YU-8%kpu%2'` 7!,䟣^9YgT/ր3T FG֕8[qꍎjYWVd]|Ys]%n9B<{V'S o$ƪirC]Us3f'tLL?3%u**wԌ-7TV9^HГm?Ҹ7[Ԓ"q1{I#;Pږ:7O&0~v/nE<&w)aBb$9gP:璔0I>*Y1̐lsXx_O%N_h]&o`ds{>%; Y2jp..ݹQY=g`HӖ:ݸRh~LFYtb ֍'5O0Fd2sf IH>*n@lYu ٯ"IT'k* ;N1c^|]p?"8+!Z|lҮZ*CAי<;^-Լe~uE#jVSH,xcT-}[NxF!L>K8ȩٍIF9 Z\%f2oBI1IZ Rr[ktGsl{m mqnd4)ʉ$o W`6ȑ.@X]jAztcAXmu{2w}?ȭuW߃xB7x04WqC"[CVJfv`?:[>$FkN[GkABHGv$4l.$o5\2ƦOnb]-}2v1*Ԭ:7f}q(~2C^V X0C"iz^_g7̺!Z۷gG{Td q'ygk|{Ä`b2^pXQ+$O4A,!#R8h{tvڅ<$tj`Ҿ sQv;C"@%`b?ZD/=:ri[1!˲)M;Uvy^4h%zӘ,IksKE5_hX PayUyxu& %eA9D!<{b\c/MX8M|0[K8bDZYyzIxźAbqB$h$B\ycl^C%f^3Q% ud//n%j=on/5Ytjx7mQ4*eM?xNpD]7kPu뙉7jvh"h vZ~U.y70T:ńwmRtWH=9y)ŽՔIBRKx)T ,NܱRNeP 0uPÈvqHB5,y! !h9ckV,Sa.& If݁NhOI(PTXlNƄ3H &Tov&x$xĎ~6##.7w,hJsݘS6TĥV$L5a0@Y,>{>6i~@V1o76Ax50 ԉh{b3>$ڱ ~»:gx07@ߝƶ3Cb57߅ j»F䁙 ,{ mx~ \=[/ϟptJ";'A(mJ.w&CS"F95@AOzO nxf"7;J!5P>U5cPn>Ԉ[ven(m#T Z=҄PM9TۉrZ[_l&A|#L)L…` #'FR+HB/["3=ۋz1~XBUS2){lodQGX,`fZ-#3WR>nVMMC[ ESHHhÙœdzTv~ulá8m&)s2(ҘS"/LLe.@HoA@xxY6D4C.{ tHӫ믴|[IG+EWNzS,#G9ǸE}cUy;Vy=0.t`*5Xb]Yf 3ilI,S"e֤SLgB"_'i1]^ґ{sUZARӄ =%6z@ГJ 6JZ!1Ȩ(9FZ|NrCD> =]6WZNeꙅ_.oT-2z/%zA'.~|Œ&(Oo>?k\o_=ř`BWLVͮf ?]]Mg/S^NXΕ[^ߴ=R?ft}!pmE!+ӷ'_/~ͅVsRl&7}q󦖁J)URFjI"DMթer#ܗ&SϊH13`IpGzK]P&%sP #u݉czIaCҸ6bRM64]PxK[2BA1ۜ$9Xm$'mAUvW`I+ `HkJ uRaGڋ$T0zGZ(bكfA!$"22bЫY~Ҳ\&ZeA&}Imw8T8Oek2$nLMJvqA{N{wR߂,#F^p;ݩ[n*H&|1 5cΦ&8PSdcv4. 
8vԂc{Pm& 㖛+sZ +Omsmƒ5Udc.< hCA[5X6zE77%d&MG#Sʞo#`&`PPɊyBIW{\}3 :28C} 噴>oO0E2uhVg։JnvK~FB\6cZ z{Q7Lk7M1)lGѫJq*tDf܉p|^k=a>?~@$@t5f) ^qϲ*Z.#x/#2_.8hTO.ΟT>u{ .')~XkZ~NۋB<ʹ_ĩ܆ gT9;} >pZnA7[ƙuÎ{NHƆϩb/\?#i9 J7M..IҿMrUeU.~/bqW/?|~qvw_< %.f/=Bb^-糏S ƕWd|Kai_Jo둉\6t wYSGX;scY F&Ӟ1 u׫iղ *.!yǙEܪܛ~ 8YrLJ00s UhUKH,_H*gY`s~]utBd >50²ɽySZ>TkaNX^6E".y/, mZCA/7]1V楊# M *r{x3F 9QlHe1@cXŷp`t:^X-oN| @55[]v풌 Q˄jԨvr sZ"7,p4c.qM^WYhj=$8 T f7DZhY97w ЅJRe2O_J%5 K)YѠ.Urez~ 'q:zCpqᛛP^}B6gM4Y}VwUI{Tʪ*+eȁjr;YCR NyK& JH?Rb FNE,|@_,uicU:D*"C0 ph7PEzܳZё7"ʽP3R-yA@c&%`5O!r"0XBvI(ZJaӾV|^[OHjܪU>ףa&}Q;`s0H$ "-VKRbXk [8*#?Z:_Mq4h<Q |R};{YƨAvb۷UF^/zjxpVʨ2 <:NRC* s:c!UUvɃ)H/dS.M8%dPI2k 7^ ] phh?mcYէ+sS.ޕR߸EZj7n)=peĭعu0Wzd+Q\ OW}%<=5VD1\aX14q+@ZtPn|8άs}y+w~pc|7Q@ 4$}պ$}yd$GxOTӼV 62vwwMk:#h>L|yti3.=+K/MFIh]tW7|з#c7."moF$}Sg.QА+DZWJ,vٯzg5fL`h'tj~[]ѽ]/>z|{ mno@׶b|yàFۧ>]'?vGz] h@Df *DqcL!B3c~ֳAfh&Co6x s`0XTЗ}WA_Ep!LhcNgdʴpND3\/{:H֢/hR=sTY2E4xE3O *BAInf7-H#N!>B?=$Ԣ+'trjtt, u4Zx)o棎)?.yHzdl<}>.GGS_B^|fvZT_ͨv2}vmCwVo.ڒPXk]vVȒgHN Ңι@mwWs.g<{D@nhxy`μ}FNQ" C}H hmC eGjngֵOO>%c\y ޹6R`Pi^R|9"}(@h_b b ޚ$VWT0;] 7LUPZ=Ǟ5T (`∠( g& !@BŹ&&JB#D9<$ Zk2鵵F+rEmwRNyj=rwUVE^)tJ}~6 @?!mf9OpFȡ-iaP "Z i+qxX``Uev|qڛlI!G-3Kj-UDMՅ/iזp7G/ǹ,@‚# I['#*o5? Ljl,UֹIX~Qt}u$fQ~F%F=mwpQN4ܴ|^]܊㽂NP.N>Q@THOdַ}#`A5\UO[to0/Wl7_~9nL~FoՍۼzЉyI`~U>Lp d|ѽAP 0@.U" I9!(AĒx c rAzQ"yXCm8 ƛ3\>4BX`kj/0 !ixSWİ@ǂZ+5E#T-ݍl<Uh#z'Nrt"̡/2b=Q9fL U *ŭGN djWe0:6$R$`a ِh=1**A/~-C(E4vE%zXB@M2U?2!0>J2bxPҢ`Vp~zd'rDf!Q: GEfPQ簁6=FFC%f T! R)e$R`}=r`fvq,s~sUN<Fi Ҩ/͉8>N ա7J1ۺJ9QJ1lsMvǹFJt\T.]^^ֈI?Kȷޔ2ʻf?ߴZa1{~7MWosa#8U۫'5›ͣoFq_BsJe4ù=.᱑m#5f" Lnƅ.F }GS͟!.~V(>P%F&^#cncTۦ.|dIa ўvCq04?Mojoz2%QxMZO숛?E5ϟt1 ǰg6rQwœV3^1Nte߂+IҶGP#Z,r3'nL՝PDe[(t0ϼX_#"w^|tG[7k_S:3biĔ,Q>JzNg_nKW k!Ϡ-5J\K1$ݑ^6ЛW; :< GtΗ]DIaXAEk/\ SOAVa";Z7ru7grQXj[3((m6hcj49MCbkWL:zЫ!{P2_*,"z܄Qdž:׫VpP:1cgI`~~.cзe Aߖ1l`II,$q"D(Bs%BI"4Pt̗C[2 :zq;wl Tx_}\r#Mb9=iezfoS}x /?Lks l^iYgL*>{.ӷwNߖKg %J1ʆ1(≎14Xa)cB\GPyU}PYW᥇22a(;c?bGu )4|Xahk_k1D,2,\~[IRj`$C E%8DG@\>`s`ԣr؃RI.ca&(bZb̄bŔb PH q&ƕ4XZ \ @$! 
#PmYb/XDLE"ᱤqҌ3K`eh/c8?ݠAKy{9Av,2F유ۛ{U-J LEg0n#RVAx y;P XHRCXoZfZ_ںd% s^ 2 Gc#/'G݌.8\U`ou#FLѱ20u B <͐c*g."H ٫^DHE`pq2_Sxtzml{.{Z2ob G0?i5>>s8~XoSIڙ-3]uxW_> o Y *1oXG ڂXꪷ]̍ޙBpo,"4Mж::XDۥ< *@Qu`TЮIXcD 6S=DHA]e2L,^Wrڷ-їbf 6u[ksu/^=BEzx-Z{tU #m;kbA߁Wd0/6r8ņ0Xn'/ yud֏\o̐29/FkĆr~/0fq@Q9t ~a5vJ*W2N72v   1L Accᠨz;6;03td{De?EG'm0VЫ`~bVW#=ZIi{g7SOٓ烇,yI_}2S+%wn :J͸? lv z@=BRr(=JS[BPUgAX$[j}3&BϦ1//Y63z&@EeW8OKҢk)T@Vm QTýaKE="BPA[̉J5g:mn3'σOסaSrR'6rqV0޵m,ٿ"npCߏf ]vr,+HƎ7}yP!ER#(0bKSUnЧv[(-,8%HgQnxոoOmx"A R@LQW[XF9' AB~/o C;5+ m{ޓE/ǚSNfcנ܂cDW<6/SJ*nb%Q n>1cxS6q̏P2Nh0#U}߽;=%AHϷpB~7nK,DH2'˘v.԰,ёhk ϸ%D_*UUjU b}J}qOfAAM5] '֧zSƾ$ 8|AtFyNۑg\AwEdi>]:rdr8>!$$4S7o=5g,ي8u'O`DӴfUip*wWJ0Sދ!_eSzi ^yhŏ/p*px_~yyT* av/)]@J]v E/2sze ჩI|X֫OLX @` _*GB tufZ)TϯysG[3V>Lmՠ.q JT]KI:/B}y[;NӬR=c9E)u80R$jFJ>_^K決l$qeVI$"ufy+;]Yv#"Ex`[,! î6G,7xlQBd>eڔPY/ABdYw,z$H* G#'FUI=}ԾWHK,-9T Vl&@8-fY/| l0s4_\?EOD"x̴1.Vh :_;- O \mE N=\*,5#GsaXTd+Մ8B&ױ&rG317%7P k8 WKgԸЃ KPzPfFy1S8󜇌 2eLb_sI=|rJµ0ySAXuk讥QMt05P%`jrnW;hg Ƭ3kPT ž+z?.cwns[ 3U7\-FBO3-L&xFz_}KK%{qzz:;3f7_}6eʆLٰ) { ID G-RcG v  Myk s&bbDc}u{7hFP[p򧁾<fo9oՖQxj]mǐ*/9>HId2`. V}Bl0ƉUJLPN 1Lś7sfPdT{IIk+50$|"sF8PI(+蝤„kԠ5^l!RVE„DgaE7gENGi8ݹ9@aAp=zikU8=!j(/'q<ʀ4$bbavY.(BvAJCêVqNc(MbQ"JˢQ0)q$ fIʎ> pt ˃X^GBH3ƀ]Ro1yi< Gs: R> 0&YԳ@I0%V^}WN 8+-kDTHqd;l.= nڡ{ Bu= h%A(cLƩ(Hc2LD0ڦ75k3:j,F:;&-:d@4k{ ,md2 G26A]a32gn7{]Χt5JVb\{vbl~ ! ;z_o@r7j:@'o(!=;ӽ!/o`\>Y%hX "uop=IW~odfL@ $ڞ *I!6Av<ҋHRN^k H)qr5B),eZGn;s]ukOj\YYRz7յ q5ءdkkk2L4՚&ڣ68!uLJ`SI#pwJ:$Lhq{o5J^$Z-" !"H@evGIě{@{kCr#܁,n2usDʖy1"y4d3L%0KGHDazoYDj,jb'*?ۥ[? 
U~UHvwf`Jn:iA..w Kgx6ȫ.ʫɠ BUxX5BTf.a8%<T]ϿhE $Ÿ`{@-/<ҧiӅ!ɧM/G\ɦ_~<k Wa^l~h+WF:E[7Q(zNJ۳P$=Z/Zٺ!\Ew)-u ݅Z+?w`Qvm`;tqxл?'w:|Zg;LG%\lyx% JېJNn7Enʶi(,uL8<$`%4i*jIaxAK1V^@YBf:~/b`//l;!ԃ>8ARsi&@0 A I#D<'B&&)q E)5( $a!Aw9 HŞ3 0c].h, `Mo2;Ew7pZw[tby3}37w~|M%v٫KU|y&qZ2 y|:gZǭ)+$iiWZNQғE+Bj 5Wތ8bj-#X FJ10;ҌUF(^ PCyZ8Y č0o,1 G"HSRXd1,qmwZzE@ R] K@uCr.9h.IUHВt92^x!>6a/P.HQ`!iS 6`YP}d@V$Y2wf9 ݧr0Uw6`~<&H!WFJ@yi5'7Jի@4KJjY$<[(jh0YJZP{PG;S(S/nwBdXI|*PV{ڔLQN)޴}$Goy&~'_\bEpnsc\F0Y>͟Lk²Eؚ o|~7xP<3,XS8Y\ dj!h# u~Dl̓B󂢣y3 T-1<fVpirS(B,A kOQsrgp>?d_WSpREw| |s; `o7ŏan%р`9@):=\&g\+AX'- jӼA(=PJ u ջM5Akg4jx]-h=CмhӇ\]|]eGU[ETֆVI-Xd4\OiNchB*xe qkgz+_1iwby$ `3;o'ݝ9n[jB IJT,VhN5W"9%N1Y[6 -ct6veDZu%jt A4e垁kZ C>n@cSs23'sƤdTV$^WS"ϑcs0`2["a! ,nkޭsFUfGPJѦ{"_e' s\_bɜjŕyoꄩPxSUT1uY#Vn{/]r΀"Kg5sϳim&w/M{3 93SByJ!T.\._\*q!y9A&^ɿ󒪏Y#x!'2H(+>A;WeNSUŧ _`_trWHB# cսzɕSkȮ+>=wu -(ymbh>wp3ߢp Gٜbɷ# 6W6|0N<]WL3;/s3}vܺ ýN*d\3 rb kK +5} J1T0Y APk<+bα2?H4}e~C&EZ|c1 c} iQ_!Ô!A+E|^dg|XnGDDn+Bޕ]Q(JL Mr}=c*|_\g٭OYg޽:G)I+2%}x&YDgSe%X~=>Pr+n@)jV=V=w6c6Դ]pY,?/rBZn^//FOsY0oiGFV6 # =Ϙ"̞9KIƲ?D26[9'@/RŏV`՝j/'-7x<6J%g~E)x_i$D</7!XHbgԭ" *4m"W XؿP% u{8԰obbd|_*0.Œf@`ZG3rEJrn̅+u ʿEOh - ~4ib_[֍ejqb nxwyw/鍖iԵm,Z]VjWl1"KR,U vTݷֶ7V0"pϩ/=,pVGW_C⯏?7̙.R/ݐTpUjO-zѩ۫rs\t E 2‚}L|6oVo \¦W[,: LSo3i.%n}v_*2Б+5ЛLErsץl*T=nNbkbSdF&'yE ].ϗ^|s6c.wl?ȥ6LJdtPcK-_u덶l[r)( 5^bvQF %L`JY=\C6`n?,_ z#M^PlLRp=,(D'ʒ| 5hoFse9x=sl;;,^b weUJ0jDud _s-g -7Ӣ^JՍ)J_9[ڼ\h/ryopwycȋ /fR冟$ FFsUT5xȨn#sA~jgrt4A̦_]ȷf?\S{e/ZO9-[17=*{瓫˃'<oMvsMvsM,x\P  !~Q8^x9843m_C0-{|6Lh@#q ‹Q% kpG6 jԶXG!U0qWy +WN0D֎G^Wfg"T~t@[76S#m+_" bnkqVfkL zļ B7ff:O5-C^)~TVWs QeTk# 1>TWI#L%o MhZϦ~d nyGR0-=_rxB]9CAѵ+aQgoMDe# ihB\lwvJƎ@b!DsQ;-,qY`f#8kDQ\0> cxqׂCM냙9gVɗ<'Ju", 5;=8O>DuT@k H[>ɥ#[ 3!!J΅.`[7N}XZyR3hTlͼ1:`ym*RhDt܏5w޺Rw!@VӫmW^>CoƒA-7Hh58Ф_<`R ֢,^`EYgp>^0*D-|SO@bO |Ђ n,2j%6ysƣw.hi7 !P+ъ:2)M0F+i80-P|س* q 9\΃jo(V!iUH 5ˆ3y$V$֚rق] |)ؗeh]ЁAϘ!AĂŠFIten\\96ss%e8;3+p[LC\gJw jkK`yg"sª*ZI\e-U Qck%F0`lY;Q;WSXM.f1}̫ CMEcRuSSԥ?rzi pc0ȃ-D:?xѻ?726UmL]oyA+w p}+"F![ GiB04.jɛ{[ |y[\Lb-] dS 
iwS+tyMRLUCƙ闸2l}|xfbeN_PrG{؏9ު˻|k]Gn 6YhyƜ!׼6>԰1K IAr] E}.9Gד@K"ڳ%-sD)8i2x3RH/T /it8W?N;l=N;˺8{RvѨd`i=jqɕC:[ 89)lݫN2LBm۾M솃VfS%1l88xab0'it6ṇe5wو^nrm 0D"al]G3 9rNV9{7օ- :pVn]zssKU$p>O=Jxh3->[ 05IiˋNC/S p0sVr9r'9/t7ҵ,xHfKe.RsVpwkDY@yƜ!G]k:dVKX_.4b5͛d˧Ux.d:>g1#AtoqcГWd]/x Qc!-ۜb(pLl ɗO ř ]zrc2.ӥ={/|eZtrEaM)ApR)Ux䧔ZJ<^o@&70A69oR֥GȤ(#}6,bP> @ ~ԣ{jsL[F#G8Gb[+-j(kZxcsHM͕QZa-|[Sسw0 [-A=:F|mQ>tZ){Kh" zC`m?"""r"R߸a3cM|]YɾߒLs>tn܇d|pW1Q5YqԚ?ChRO^뻄=o#YMbY;;PPGWʊp&$;& Hz/|>2L Tä/C-c#AhEGPPQ>%6; f_P|5޻Ģa3 aV7fPJ3R\BoShA"}L춫џ> ֧o,[zkY30tQ}4R,Sgl~K!_V "QE*\Sy ٻFn$no2Y|deٹW(#?"3,ْ֣-!i Hl,~_X$XcaÞ]uTlKiEs$byu?WϊƟ$#<*6.4K;%pʣH.:"Ч.k]#)8 H  rc`: >ZSx55]A}+UW/mUE+z ~˨Z P5"'%ćh%šrv>=Mŭ~rKX¹;O UZ$ @T}0 LD-JK15."dLM1Oj~.NWeI DSQgX{Qt,y x̯*}}N\Œ:\L:MDǨv"haX4m}њ` b&OIlT6q,|(?>1Zѐ;`$["2 \0&ڲ?١s{dtof շnaNN >Oe;F4.x;'+ ͆z/a.|T!Y}d<#+a`&eIWP'/zf?.'xZy"fOg>̪?gW,*e\ųY v2̈́g 3egZk >סO{'4R9sQhNB?+@^ L&AoRiC v?PD7# |Zmf}/5n|E?H!\IKwշA^Өg 6Q`0\rr%ӏG`Zf>7,&BeB J% o~wq~!fp>o?h>8848/f|U@<ڇw?}ce,xxa'ӿ5FElHƴg7~O?$(z@a~U_zU_PQfc+U.4Lyr&a{BRbTIn>66mlMdjM = WccX<8p_Ā˳zr>EQg7;_}֤R;\SмAx)k[R86nTXQ&(Fb yE3ZaOэ0 F7kL V$BD٪ X-y 0JQ*ސ=QG5K$ "mVPB[1W?\7,4Q**VwW1~Mt rO>/L遻op$)@xwI YXp7g^/>lh  lW竳HLo1py>Ywz:~t3~LonQME9xӷח > d]_㆗JDBX,Qe+‚btZb%,jp!jw1Z# y$ifdIICI{9cc,eE0x|$oae㾑ey< ~GTbRw0w'0d7Ʃ4a3_w~^ZlҽR/ztT0e)e3nn\qs΂rZ[5z8!p@CSTo_W [qoCp~štE 9|ikuz-m v-> e'"]qHoؚ֪f_37;*1];Ň;i^NU5wV+ńz;5u_#0p ,F[+0ʀО]8P?"*\N`B?-n W֡PqѼ_Bex{]qu ضfBcQ`" V1a•@g:U+2om%to1$mp|q\lAOb#9)XK9lFrϋ]Ni$Pj+RC٫Ǘc^Ŗmp$̶Ul⟍∱ sos@zōҐUmrk2p!ds#)nslLyteHN0"{(Ի˙EgӴPWQy~]θKv&'F3FIf=538{D, k :(Y%ZCDv>ܶ e ua#rDb /'s7$Sz?\M<2?:7a]v}TaOmޏ]'_1x5)e0MۘK;t_nSpq5N9BKQ]8 «W{z%bO~d݆pLssf>YyXAKUNjOM^naº5 IuQ˺o [ӎ&[4䅫NDZYQv}K"!%u5S,@SvKKyCE |T'~4c%H ʫ\ujHR_ 1 -Zo'sBx"A ,H g\\zMガR'(r$t+S4et5߽=3*RV|fFRʢ>cALh3j9ɤ"SZ(ƹOf6L(h.ƋsPXbpgI0ֈh98FCPfK JFlޱE. 
j3#h`ߛ%Fos@Y2c\pF#9spqZ3 Spr~ Q>QٳZLUa5ep7S-b02AFGFN!uX z4XO?d6@ 1nf"qi]1hKc ZYֈTGČ (@keuYֿC &04&#RQe} )6Ts),/Z_}_XuժPw2@P"(axAhݥHSNPXaT W/iRq k^h)z}P)m4XLV[6roͺ"MJ N$=Ӫ 96tPGTP,{8$\2xO_9?#x8vkbQ$yۇR2?օs%e@UŞ~mO m'@XQ5xMn+槅%1'qF@jk3- ͽ,HV":.ovoIpo@9\QwOJSlEkdo>҃`HhP[H!I<>{X׍QJ~C"T?wN|CBaNdXn=5tf#t_?a w+=y+P[ 꼫9C-ig`t9KrӼ+9|g6籜*eTOE '-zr(!Vt`;,k]Qֵx- mOk*@DQ!18XXd#St%%qN'd!( hfRkeV1 &T\\"j%JL7~6ɸF1}5s#wIzPcL.T0a.cD,yMt%jcUJAfcgK7 9sJ'O #(ksr"r+=z[Z(sGs 9l*8Z”+-yq 9E'8'H 8i g*BbuiQyKyCE:X!q;XjjJX$ԄtPxNSKk\[K4ϝaayǛϺpglI|a٬{wix{g Bm2oxJ60i"!ppo|wie[G*Jao0ֈ,ggL o3\(t(*Sٸn߆1$kI[7Լwiܬ8iIԺ%eh0%Ʋٱ\|w *!n=zçqrNNvA?$c[ܘ$,ijLt*ѵe `+f4g*3IJ UҞЇEKmtK4TMQNr$Icpo2vX:`+@L=qtJӨ‚"ll\*жF/)#,ޕi #*D3aYEk6oYJϻ0mF =ѕ^"[!#,!_ "% q"8 Q*ڏz FT]l"AlJ( cGU|TCNE}yd/wARV42NK _!ܼuE/{]qbg%˛sˆtrX!&Q\ y]oHW9#? v1d>"g][HY+JMId nݿztUuuUTIa+FIdR8h-0۳nOdB̊,ſ{D)Z6XJt -5b-#S4Z 1ȴcPR|,_L ȨYi$Q0)jpN.$_ +>匣KiI)Fsq\"w:Ul |WUL&Bcuym}bƸa0Nf 5 DSm`#LY 1i`KZ9M=&;/=H !NkI'5(ᄏz+PgߙfSp2R1f~8BB}]uC+)a'}2ni pDK'>EP)SF'I.H1N3օxuX݄T nހDaᅉʂ aꬡ9ޥ ű3R$, ";BH]U.r)7SS;RdpNhS"ZD"-i AzG&!8[Zr0XzEYqcx:b hqQ@#Gz%sZh2DT*M,>wq}Zt61* la-y=;5Ö?/JG- mv01EtZeg_bڲu -Nz[̍Yk[83w[~ya|18H(UBrN~3UkEhIORG$dcI0CE!%;̍ Bs348ǜqYK(Esʖ'Y>e˥QS^H`|D+Ŗ# ,@G% Ptћ˳! 
@j.p)ɔ)hY@$?P B04健;(AP 9ÎcrRkiHxU%'cp2Z`o,9`eG$$[@ 5J{΋wE 0ީa HnY1FRBp+;G( G.zKXt +8I `aF/a.&A2dЕ"}EkHPH0E*4a$F(ȝ eאqx`*28IiFYŇT{f""zX S 6x"WYG7boXw|h-tzy7CU/AaB軬QDE_v~NXIqUlsDTfLm2Ar׮m=yEP{| w.<Ă Ԃބ.1z&?>޿Sy4LK»\]+ ->C!v-Ż/kw=%*,6> yex1;mQ7O˘ T.syFNѶ;g-w NFm-N &/A/B7&^d_YXfO2W}}r;7cX=j2U- Wk,{#nepd=BknarX]s^o07`Iܬ>d'#־`֢òp׊zzY^Ũ~b]kG_0^_<PvG^5ҬaQҩXL4"=TRa4R_0E哷%,n[\ APP֟u_i5dL,#Rk;hY6uFdĝ_ /}g.7]#y68Cy}Dk֒Yk3xb{!h$I+di3zpbVO+E.no:1WKj@~-y` v0I(râ7'_\CtidB mzjh5h3㐜DGL$BsePϰWpp )Ck׈1m{޸7 pǔ'h"y;' p&]#l#s$ r{Z'SQIF(07Sjb6{suE<-XxE)a>詒" /%qhHFa0mQY {^5E90 AΪ@PQ 1P-$BQmy+"1$?@c6GHkzpFvZ_;a{eQ__7W jd% )GƎ-B ٵDu(z0fq֑׵}nߢΰZlӔv{)PKMZU2f9P\'씟B達U C\#P!He1K)^\ig.T d{EU ֣zܛH +Mk~5XHGY٩>y9@c{f!/ ׊eg¥Uަ $ < R ^s@6'f4I(^BJ1W6% ~4=156pWּe9 'ex.LҘJ@5cj|@3E/ hwX`, S[; ҚAI{}%nZ):#6i| Sף'f4Na|spBʦn,imPnէ P 1[3B=811PN=ex( Cs]HϖxKoWHT28LaDvj3QeHED㬲x<{u9\}+GLk1|D<9vRb uưހjFaFy 0D$ild Vߍ*{uǑ& sS+;Oe# ?AI\㎸*ٳjR *qFPeXT}d"TG&|\97z-?/l-^K7Wfqu3WW_)IթhxNad 8j_ng$*vDO_ ~V%ᲥWb ;LkQRFIÌFK)48 ךiMKO<@hA (11o{t]etNj] }yaV=;fl$ ۛO;ǐ S0-=w _a ZE{ڬL47h4.կe<8! 0  D(>2}V,2bp}y)X j;%^BpdJɃLn/N!-NN܌* E9bT2$K` CKB#yR`_R"Oך2#֛07B)4>+"* >Awm=HCpT"8a.6X䑶1'^^ɞ׀ZP,:=><81.܆jXB j|̘ek+)pD ؝ϏxD1Dw"_(dLzaIrsW&f)fʅ`SFb!8s3J>?6ĚsnbOeUdZD<{׼Y&J&49Ny CRX`o(]\qg6ΐT"!'B1L!"d؄`1 Id.:ui3cQP9*AN#0! 
f}f͚g50(RŜ[|$_>'s@6H)!M옐f!B6O7`둝~Jp.+6skZIi]_siN Xwͧ0z&+sSbQdty=Zaf13ޮ ka{"kʛ^1vS`wA}r94]ר?R*HT2; QV,#@S Ax$HW4e;h)$K%i[Wl1J&eL, gj?k)ƛඩTOٲKɛ#u;ɝ-Rѧ/^\朙2gY*$hMQ,8X&(R+YK-z$bta1_5-n#?+knFH~٘tء%ӦD-E % $HTu~PKbDcEBjyj&.Kf"ShE F*ˋcf\q-Htc o L\pEA|icDHB{;"~ڰ"NftpC/+1pգQo܃cş Ŀi`'YƉD!rq?iޗ/#veAIW6 :B> ׻`}hC"(tc6.K9bгyB3G@9k-N߁yU<;2Mm'!V!`L abgpLAF^88x|pfr 9"f]e]beY˂ ELe&QT+n`ȥszf,Wfi8"a<pI Hd)F)٨@F!>2' ";ZRn8}uDn]!w"R*Z/e]d9OeAIgT)&.)VkrP?((e&~KcM|PܤA9DreLR@,+҄goFx(9Pfg1D^ Z9.ZWTc{e̓ٶ:Poi"Eb26Kxme\WMVj]0?z>?==7K>_Wi@??Y|;[|ܨ:er0"2us+}w5)櫇r|z\,w7Uc9f?1Ubq#~5Y$>9ֿX9\ ZJ E'Ly3z!8TO~$3^M~KO_ oK(1O%c3 xs汜HFJ&\=ޭ Jfsra&3/YvC21COL3n[Ǹyݝ||W 6T-A!0KsEjbk,p5wLd \h OTQa/JGC쟄+WmGfj(5a AAZ!ys{}g*|} *|FPElU씧.4(NZX/wAJ4cD[@G ]Uf21JlIhaTtg}4B _`aX#O~O^O4)ΰ[Ĺ[X,Pl4=[fvb3"1…_uEi 73C"߼#7ON`"0jSȨ~<8<iVdT63J] vh׀B8%;[ L(c{=>EKb7=14\5?"#ZMр-  =BtV`Nc " B2.(T%c`@eH VRJ3eO Qgzl#0rB'qi\q! ]U-0\Iʀ*gD`ByDKF .)0#85|#ZY$#"h)PA!߈#%1U=0F-OQ{R!F,@cvk/+vNuSKhHbIBS^|Xw[$?D)?^&w2S@[<=[/k\QD,r^bL/>}Y)a~sw[,!c5~M@"z@>" 2(bT G42)"aGFT!#66tj$B_*Hr&TĔ3C6+.c ^+!㈢NZ2' o֨O_BW:k=%cki9Tb2叓w4~h07w0+>w*b,S{_<}5aعBg)@c#`펍#[KE7eռi\? o ȑ̦X^LiBE!&1odǫ絕u*0&sp -z^WpJ|S icDvjkZ zI3m#;u?VqH3-H'}2- +sZ16';K]6wl.&d&^8h>wT-o[}efSҙ!~ kF`3ĺZ`|a ft*,2qf cb!Ok7@f&QtX/+]w2F2w@~8)j:5ʥŠt D8َW W?Nm '𢧘7hRaYl ;{X3qꇵ. 
Ogum5jԵQOo0a9Ղ$c!j 8:MT*g Q62 IIFV|.~j,u)Bz3\VPDatٚQXmN?b?7c|PSAU` L%)H9 sDL$I"+@+P0 &9g`1CԖ%@eq/VfdqJ&̊@ G٢^ p^W}p37 G}-N䌴*Aōu93Ix֦z[y?>]|Ygl=5{bS8 M~t0ipΈY}ǨM]JJivW_>o5MChL 5t6w|iWYq3_]o]k');(e">'~WE_~ ew;)rf,[ cNmWpJ'cS5y93.u\t1gjo^ $1gC:4Șf`LsiM[`:N<>; 0L1yԟɜpJ5Sav[ /JJ2E2iSɡҧJ*-mG6Ϟ~^n_h{ϷGޏw.NgJit]8"w|J1hX/zY)F!meaK$aEN],_豃CnduitѥP1]Y7S~9Wr淄3A5XΜ#ʞd=+dC x4Di Gp>PE kv=BN/2k;؋SBNs7VΠ.%ά/(=umF -ǝAG:1bʠ;S0T1hwfa{F9&Ny嚳0gٷ:Y_vW}i!fhЛ6?lC YW4nmf~}M2ټ<Ėtm#BVXLǶsy<ѻ4=IawV Z\ӽX-1Ffw?Z,j |wʞ>FҦQR@_6sud;9Fryw{bg6K֗ Y96?<Ȧ+4]L0&EFʦ{{6yO'o'XjOz'~qOO9_.~76~uI8_v}yh-l7sV^l,`y )dz/n|R& %<{=6#^;TnrEҐWA:%urh8EcnNhMpaS'S-<{u!\EtJџZ7o - mutpO[xOCC^7uSG3 GuBhb݆<#zo-ӺАWuʴ~RYǽ~[!CBvGAw/ @K}%OO#GΘ\Fkf3a(J)C(!g;D% @5NAP tyz=I7weQt.Y!L)fm_*HtpܲVA8G^ډ@GV 9&=Ct7ze0Z eg18|taz٘ ܭjF9Q1`7=o2V:Q|<@R&FPRrΑru+9T)+aV%nsHD6%ٻF$W %}'g` w; lG4"m}#IJ*,2UEF_,VF~ Q]y0%VeחYZNxb6ػ]UA{e"J/Y65E_R,݁Ϗ5YsP1FAt-jd-bEloUo{q'.&z!%z7JJx/㇧z2&όMc3wӘ0n/_ ŇyTga< 5$]dA;Llސ ɻLdyߒ9Jv4a6p2fv8a+x:Ԓup8+|%yW}3NWXN,UE7ۂ}zz2vOU%tqFqD˨%L XMS60DY {fd)Bs8k< )`R|f4Tnf$|bn| n` voՂt@)#Db"Jy![nUq5kbC8I EコGXkGKwa[ 9>lz4VRZcl3c~ӓ/G1I6 w pR2#%ύN^_c6Q(P h*x)l~N@ӬP ['s-c?NVW-j 3R~>߾F│+eW-ȩykE|g!GLzɷȂw]Fv}Q#(FVcQR;m/^t~;.\?݌tHbNw-#˫wβ[hC[6IZoV[\j+.yMW,TRRۺةeQG] Ŏ{Xյ:=kp@[n34|lzF(7XSƝ*rb(kUN! 
qXv>хMNɃiL * PiX(2G4.D%BIM@A3AhLq=gZS^i=󻀋Z5hdtvSMWxu(T!]5ʦd 67.'%!j4lE%X"t}sab>v)uJ Hgèf)C/_:VM+_E&2D R[m9TdRdjfRFR`vm9efRU3]og!.(mȠA"nwVtJZNJ;.S!'N4tH:ϯ MMDg.ºU )Wmj~D/RϭD (XniE:J [ VV/+f_k`'NwK*8H`JN88gmU?PTImM S8 XY>"})|Jb+Y%t'PD7WF\Orp0D@; bȤ+*x xG96 ca fΞ~}y87vFF[A*wYjF#W;"*n+V|O-z5Lc ̰*籴: A#C ɂ +$,H0΋rP:)2LсnKi8aʦ3̝ *JE q `Q) )܈B]P5Sk*$z =!WW='kWw)1 4d h6%)PZR * | %U7ԧ*v;F~K:Yr`eQ˂b~3A̓ga[%)NDY﯁?;YǟRI 85׻mg5uH9gB67<h^%gwZ6TgJ5\U uIu,vPpʘ:lI)W $\C˨(?d>e'UOYx[W;;U3~*M;N}}zjg֭A39GFԢ0 ,h-Ho,)9lp2gtк|c.Uc$!}ţU m+SeWW.J7#)LQ7UO@$!2j&fBc#ih0* ݭ ,lOBM?)/al gc<ògc9zl)Q#ɢp^I타1`X KACǃ_|~ǧ'X\vA}th!{b t4P49܄Ҡ_o'7[x Nrc:PåtS>i`j{BT%)Boax` ӽ grT.!)uRR䐜0pc˖u}A7b;@ XHQL>'J i E+ ia"hoo r McdMxW ;7U1Kz'ᦕsЋM XJ<.*[1.eN8;!mkxSY1hǀUvKIMA,(1-`ץpSrܷX;%(nWգEi)+籭Ox|!hIUXȻO_TXCon1WS% Nr͉tubǵVK5XH`mr2)!D >}'K3R, ,fS IK="_)"q#1"q#ՈĎƚ V ;J,l< ڙ/3_'BWc )Ri$:@OeLaU1T#.XXfq n NBp"䁚k䌰BFF"4M >.Pd&FIV d1ʱ,X3TKvSD$|$)/f!&AjjZk5A`$Xka8NV8(xz7Ő*<8+jǻ%hwaT(FtT|#O57m~=I8n_0W wl;Lrׯ^ecrW<øE{|$1,^Ϭ# S[`z34Hy, fUx`)AX. AUpHd)iʕq0h d!0mh`Dvħ5_ތ6ܰAp /p^,byN~ڞl`-8WZVD"TMe%x}1v)!X(Jq{8"BɃĔ͌%.= )CvPN}qqI%*nf_"cNV1ޤdTANco5`>L"Si+A 5k .0TA#&E:k!ZО  Hz݁]#oNEqE9wy  Hț" Ѭɛ,y)pySR2Ej e[q۠[7(q&3 }}۞R?)&DYbY #5bFEi h+1tZ)Fџ)X> hk)gOHL܍d*:~2պ"pGӬx7+溶۶唟[ޑBOU2)cTdi$y $k+ƈ-7l>,]74\WCd'GV&lseN ѿMvow% \u[q\ k<#1+4i:koN>?ܭ~0ŋ~O7_aF:a^խP w\UKz?5gГdvEeyJ,EomՍytpFWHtBͰ 6n }y'҉S([{\jVaҁF=+8t 0C89uR}0J_cm+ ?f@AgVw& -y]Ghu#- K۠nW517TwfB!2߂fe%h fT)A(바JwT \gv4Jbc50,YaxV$.bVрKj?ƕqV.ݚ9fc?]-V +^{(b9}xJn%ކ"UXԋo[ܫtޝ=i24Y mMgrUOۑE4K*$Un1nN;n'Mb"'lwiw.$w.Y2%ѹC@;] 1_%]C' y+f=.,Fqr hL dd Y(߮Ҹf7~ ,7n򄓨a W#c˦*.7xLۚx0*(g5<ڲ'%5!?El?6rr~-(S4Z*5={q_2A("ICLi2zsόNEX9;ڥ&>r1cg6 * 1 Ocψqn5/gKݒx 0n71xhu.qox"06WCfA.Dޣ7:z( 0[Zi; M{SCF9 e/t{EYǝ2*R3,X~ת!"ۼ~*!**|tNd-] B|PK73obF~:n$^K,\]jf7կ!@Rޫ@:+QKqZ=p9k2N߂i]]uuu\WuUmE)PI3 $p_h*i0%}QV ܗ-X!+KTneµ_Fp"`x5u `A;K'J%R,HuBK4_*%-֎fC ``( ZJpxBBH1Ny ,d`+Nw$pM6;RP )bmMS1]5ƚAJۃ$ 0!ά:O:wiyn A@:@wgIlq4v=z|ls9?Bd_}93ϡA8 `EnH ܑ w4hmC8[*5wO⚐jw@ Ʉ,6d wGK%}<ԕذK/}D[{4[gI0El0z)ǖagAϩcOr*<|*ȁ:Szg >׫9lη JӨ.>nD|rUl8`>.ōMaQS| 
;9崴[n<&<.`{@D )D%PIw*rY 5Z޵-T@*U&GZ+{R2Dn^l3'FD6K_:}^=XFdɸ~W$V'x_N~\>MՔR-d^=x/'p,|}6of7|1n? zb~37e(:/~ú2`X]5LmYv}31*ih걁E/%W-! G 2FS]ͨN9USvF.u Ӡ{Ep-3W]-3VO/Oח'=2cEFG-3pThQIYL#+{yeB3oƼʼn8 4Քv'bP]>!qp `Xl[tO/~2EFƧ/rKITvMsݱpDoaJM_7f)i{ P#Zo+ 9fLU $1fUbܾ6q iNp!49Dzzm寯-¤SC+2jfSod烳]amáġNǷbRG=i=n-{Yi٘:{:j '|]uCyTR'yM՘X6.e୷C|;A*Lv26˜_6q${v9D/bBJd&tv): 8&RIAx[ 1-e!`/G2,MD );h)G5N4h$.yk + kԞ J9wVA5U-xz/ElA&zEҨI!y"E4ygOU(v+ QXK)( 2!X 5b$4 VVmRt<0= RjT U肇 Ca]6 bID)8 Y[ X(m|eT-:~11,փmoGhiܻɤ ^j԰so5L.a<M7r붺YoOx94ߟ؎L!FOT!=3\+~44ŒNg8YJe;N^Sƨ=~x rNP=!a1DSF4'#I1J h|2^6>TLzxX>\ 09kf 'U*65INp+''O"\q𨜌!qNF n1E~PϢs;p: 1>@((z kWL48m]1X-tILa#}le7n*2Ivo1"p4#XƍsYﱬ#6Fp퉥 =<Ϲ`)v{QQfKR+ʧz2ਢZUvErD o8F>G4ⲛE-S:[A O@A ܻ~z꼙^;yysVhJsz%CV 뷎 9`9ER8h'ƚ (*3,V| F P܆G`r&DŽiZhi`"uSʠ{6)ä6Q\ic)(-J,w ѝaҟfY¶=?ɟ'>oo%(D|哇X| r35TkpM/.$҂x)`cP0TvW uY`CXTK +609 As-R0}ZDw{eM垹װe<Z]tXF,<\r>bLz t'&QÖJؒvV1L|Qq*7(g52XfR6.{&ږEe[ΞI-/Y*=lDzrӅW%: b'̖x#7V(VcP933@f6y]ms7+,}ۥx\[suuW~ٔ 6+)'V583$2-MÙӍFhw[E;wy&X-7Vm2 (TvlomzB5ٜ/-Bd\b}sj={pn=sSs8 Zr΄Փ|&ɦ𧈫M!lޭ+%m9-`Pk;\ws |D(;oW}t)nqPC1r;Fn}5Zp: aO=\i'd6o.sw3XΑQ>r>}gwxY)*n}xv+ 5%f7l  )ߒ34v^l/Oֶ"f :"5<}g?4/gD ;j#y6~-4 FDQ rR_%\큥 Jy,E}PHV<O:({9~Df~SO%Tb9;UZcKJ8(c c@s`" z뇯>z">";KwsEJXCQ>qZTN I=d6`f#kǀ g+uF/dv U`6^u\]K=DVPCٔyk8't0u J 1{th };]Ǩ1 _SІqsrYޙ[f ^,),m,y?k6f!'Qp\1 &% %.x$ '%mp#Y#nGatOT$SbkӓEɆ{]GG1x-+F2J 5d`T{XF*cFa  N],h=)'W"o]/[dNSP~E2Q HewIGЇҷ~R`Z4яOc`3O?~s2Jlww ~Jo;v:nAX\J?]]Y&P\C o>"@Ap}2ocMyw"wdXL! 
l~}َ*s`oþQT9YLJ_-&"9E=XNǔ3-t`Y&`Im9Q'b Q[x *90^`1fTd#}mNw}Л)rT!%Hڟ}+ -WR#?iuHZ>Ip~x#ē踍*4I:bת0J{@Tkaeڨ.>o`W g.n{L))\i襈Tck+ζ%9~GwL/1jw6zm: b0$J*}o.ӨnͯWScYg  EaIU0 JPm>u-U\ A{AWt:f1{@)F;۰CU= '+c׵ezDnl.5@G+:?o.N tJ gsiθ~DO9Ik=]pG,W<"Y;kQ;wۂIx)tN' {)~SW}8htSG G:)eA5V-#bɖ(l헺 zZjPQ)]m6tn` ΎHjle֗j#'z z (DBѽ6>UR~θ }T:w){2F@(ĝP@SD,K1^+h{'niFr7kØp8LG1)g>Tüv1:!챮Һ<5(+¼ {sy {n`pH]ч< K"D kq yR +90z ̆;z7yS%1&ڗ+yylE_4Խz*q4|aт k ZԞ~s=9)`Lm,=F` ]8s^A=Hz)&8LH``Nm'\%}1I InR8ʅb0bvAVVZK{Ow=Et]=]l%FfE8+uRFXN9.IJ =leu*[.g~ >W@k jQO9vl|Ys/CPwyPI,YRQK[Ҙ$!\>K&%@8muH)Dmm RkIOZaQEI4XXW1=}\^cLBAiUYpJI>$fX'G"7D<6\S6XIaaer)X1[P#n) %K6Wc-|V4jVb0@kk.(6ݓIdguf_Rx<~e˟hr6~~J/..%}F3T)ZwYZqM> x$@:N@P'<_krيz3xtÃ,cĠe ʂIU LL5Q%#dX7UgMwZ`$(ϓСvUjS!vŠE3 NkQ Mtshㅍ jx}izb*Ox:-UXf_hBb[N9rC r.N׉.{ٸAgA\e\e;>`0 ^&J 53cDۈ{M$pB?~ hɓYG_GU݅:8-IG_k zSL=wdDj`(G,7eXku8'}ndA}JIjW\L8Dшs@lhd!Av55@96,7Βw9ê:urd%KxguU期{6dϩf LD~Xa"x8͓Ht]6-][uiZtN| @+-*'@N2Yk8.8&pPEw6DHă8 #xG/ 4QZ]>r@2{ jLShX^1flb& A(ԅ8gj3MHYs4!%wXG" #".r,wc]c:͂dkQ!i&p]]s8+|\|TSOew&5O].$ڎvz6}Ya@fRĢs "7MUը@^[B$HlF>(@(;(B 7d3*1(DS\e96p_IH ?ؾjWQH>r%! 
P{}HQ+%AH?opč,B"d%y`GBLbW(JB{5*a,G)yi.8d,HX|N5?X<ɥ6+iĜDTR%iaLBp&E @  I fT3bˑ0U[3W Zڄ>S[3:̅ۍ=-D2՚k~RY].ezhJ6>Ú!{/.V7SzLD5yr}|aT QCPM1mXw֓0CQ\Ka 9\݀+DCpg jh[sI5 byt[l=Ka>/Y[K"1+s^){^0;w*9L1 eՄxe8*`Z %yi״(UΩn]ר&BPŞ+DlJ5;>^B.v$ M?_Vꀦ?сUlޅ(\`܁ |[ 4CxuZv}n-8~R gryZEn_ܢXVpU G΢`|{T|('hQhTXFEzTv9BibK d"osE' zy-ش8K1Pׂ 7;!-hkWWy/߸N^Yf„]BTWsnC{C;@_nSTYN`gyetw=P23"v~>7 <<ÛzެOaU8xmoTKq=V$cRp/3EL`qhǽz(c% Gf&GGs{f+A{KE Y9`)tVcS:N@I3 zymס=r\ `QkQoO5( V V.Ն1 -1!kmM5a YZerJs6Is"{RO{>;oҏsޠ{>T]|._h܍wmФ@H+ӥԵ3ȑcMGޗe:- _Y޾3YINYC7WZ:%+ Or ֭.|T;X] S^޵u_iݚА)\n>uթ*퐫E\wiC*xZ&4o[#Ti5^x U#&,J8I\/>'xsA%JX3l]p6&}%K!9 "Q%Cd9HF]X#i%ZB)m`4 hDҙ/iPgm#G.!o &/3H6xXYbl^6o·$[bUX`A˩:;=>yuMO/?1o{ܕlt.켉cͨS`:4׾mp`hH )'XT!'=5TSc0TkdX6?1lVrcSkD8HzjBZ4Zv}M=(:r;DI\lp"c))6Aq <҂ #wك/Ly0M .h[Kb$W&:̷gb= 6Ԓf\H!^Z$7nᲃ4 !R2a/y8LkJ¹pJ Pn& 08fictnK;DZ0@P)H2! %Մg0,|>UЖrHL.g[=.sI2SV`RΖEQN9ƨJ^KNK4KD+ H{ u: &dCIT 4OΗ&៮M-q#V[a> jGESǯvs8E*j,l,,٩TjS%SM|g9@N1I㚍I5{ W ,v6\]]t)dZoOz7ʳjLq&>Ɗo8eG̦WD%9jwI"6GN"U$-8{h3YxOYk/R-6+ڼ:+zȫT-gp>#9M cv _tuM@GxI؃\E.yM7j ?Qi@I[.(7VqO* 4\*)"nԴ9B"$n%)4 [-(*5I)prO9 IKCpYr$Ԭ40BsQ:JNi&l$j'.':}S?yPJs}qy}sx,\[n LH i Ƅ[6D^RxO\pgRS(=;Ԧh$H`1hB#>:DBR%~ӥ xqXt4uXQc!6zPKV+A*ʉ׆Ydg=Acþ>_nqOw,`^7QYqWpObzQ:Y XM5(@ 9k*nwcRŽg~M:%O\:D}c& Q^nehg/@hlV"=)EdFD J,LqOKT*8G_tCM_($ F^8}Kz~_b.~imL^E뻶Chf+~r7UE<~vYt·O'ՅpJ<2^U_Nu2M.5MہpepkWR̈́D≮@`O+@X& xW!Ճ'l3GaM&8waNXO <#D8Z’L9 cRY v}BaJ}htJ:u3 у̼MT).Ѧ8lU4,4䡠yZDB1* %pD$5VݼLanADJ.( Z6BG)Mل$?ғ#= JCt.T9-ץǒ3M TcjF%l0bAK˭ুXzrexZ@vS&p ryˆT2ȁŎL9M-[R J9 ^@N*Uл qۗ }PIeQXBH4.Hw4R@8@Šh9eNyp&]0Z޳Bm 5IGP䢘tD+E/kv+%/;nh9qHŕj*14xADUS{HA#@}PT <*K4.Ԕ`Ѷ4H;--r@VϦ$TQȪoZ.0 FNiIk.#R1iʊnM;秚$wާ @8մ\3\nԆf eZd޽/N&Up*Avs|j!CN2rIbjĢдZ7 eP0+ivɧ $,s^Oe Og^5>_o I:)' ק1tP@){ }ZC,Ǩa. ;!XI!{<@Gܴnq\P>P396n(yLjڪ׵uwp!i*|=dԊ, 59*~$xTԉy$(K92! 1( J_/هG_s5ͯ>/Il3WA>y87K&`x^{{)cZZ= P#é'0s41JqTl~GqT(Yh,I{JXSBĕĜV* ,C\sy d%ƱD=@{Got}k1gDQEiMPK-;@}Qk 5DhYc:ڋ䉄cZ<-C-dšydI?h)CQP㑝QD!ÅꪣR&B˂&Ėomv~TW3Wf{.l~<8'X!_vV :ãBapP(*- ~)_`80>s2\u7 wUAO4 u8}2k^ VGUlk>ELuW{8t; + ?h r殗? 
b42p`d(Vb}?3V =V-A-gsa=OnQ!*v)7F4sǔ^=#D-%ݷը$4MN|Jz$j\_b}zw*$Dz>ߩFR9/])9Ж/߻CBc52 0MsQj00.piumD>CIY(P?Ymag`! S4Sd \JNh%-r@KlCV9?vk3q}gfdu&XaJ|Ika\QnJM4a xxaA# cEɋt x?g;P}hvS NF2@68䙳hX5tS[SN5rۀWi!FҭD ΃fO!X$U:0g W7KiF1SGA_Ŀ.dmq0碒BUG>~u_{X9y <˯˨`%$ M`^#P5$Q򷰇$)-ĖOJep==d$@Z9EJX=Mvjt#T{`$C VIR/(Z% ^-yRt%Rx4j(5o$OI -JlJe)wZz1n4+7ی)VSJ%YoaE=k*ɛzз2 ofjQ@>U'koΛhX:@wv#o`ZO˙~_pSl/S٢,  e]ϒjc00E#%x@ڰEwUuN_+҂݁ < WL 8:-K1$Mq[D6+d)- Ach {Cn HQPɓyp6-цrI*ٖi(D lp" S۫9Hx '_;(`c%3"B\0Hp)%{J!&o@"&??{V5%@^v& ,0@fgzuNNg[dHl$;tщ)n,VU>AZ|$Vh!^[x`pQHobcrYXcX`!ZW8f_I/yخ$ v̧ARRW_m|}ka ŶVWfB5-ƭN2S$P9|d[}vc(D\rQ`hBvV,y ;cDm!zM?p(„Br.yf ?Wvc#/t[2ƻwv+|ۏop~ǻۏY^[wﲿ}W_]Ə#95CÁu|STS!]yZkXyu'Fv1|u^|t]];+8Psrl4=(09ؿW+k1T ը Uu!>$d%=a j9nN)GH)}9ޚp j>ֹYKtS"p' OA=2 X'1 LX,NBl0-6q;'O9XwڏC}mvYl0Q)KOUIwou#*o 3Hkg\;fn CxbO@#Pk9[l 3̡J$s4m}At}!eQȠER s|d7"\<%P8SjNLP&);dɔѓנQ ptk_%rM@** @cʿ \5Vm7 pMk+у"iFK࿩r^g;0FQ{WXwR9U-[#N\kZ[0Sʵ"pBdu1l".BA5Owվ/*N=7# "7 xk\\ksʵk+2\Q4oޞ 2NϷgߞ ҅;{mܽzů(hKju<=bkOe"i=n6۬ԡ)Q!M*/h<1?IACQr$by+Q6/\^]@zUf:Ag ͡N~&^b{ʓc=e-x]8 ^}Ԁq@veC w{0%hRˏL6 0s>5b]ndM|* Hҷ 'Dt1%Dl*.VU"Fc2Yȥ H' w4^wK Y3`TJb:6WER>r}H.L/g| 1{/jB$=(e2wsB}{@`?;|p 7`k `HYdw3CRgʬ[eTB@'Z̰ ¿U qc!B`)DE;蠌(u)fޗzEKٕ s6HQMz!T=eQpX7+ߥzǕ_65@ͲD`"%NjQqH1-)6bmTt0ō3̎/ ;މ|ɎEy)3K {yײ03+ hAv%0IvDP32NjDx)! (a#ietBq عB))kX7 33ȼMXaIZ{i JkapRNV3!F{9%;6 Nf +:J뵌;c3c8|5X96iYTTk< i֦"HNN1}6<"F% ڠZ6.of0o.fYSkeo3e'd꿋: s h`0v-L(O3Jy JGx[C25XqrloǨZ7](.uBKu|Ǔ`{T-_70q䧥<笣Q:B@yκcG/3q`'`X in޼kɖ+^):YoȕHso_Tϣ$Y; )Ƅ2$!jcټ ,M*Ɛ8EA(+@A#ƃA .[Bd4>i-80PDGdՔ/sێQ48g_hz!n's79^ 2X1q4 hkxp6^ì#v8/fj'Ҷyq"M[q"+2f Ҩ6^#{2FV6Ve͒(3vhJ3~# BNBie!JYѣtcЦ.P1/D[fy!*/긅=ʠK:gIjevKXꛚ.[U` +ˁH󗘜ƨMu[u?TZArCAr$w j $Q5Bvxz8WXyxA|lװl'5xV>j`Փ=a3~DҲL,DVY&g+}Vi7'mh.bL[YlDHg@ , (pj*T+74'`9I%gBK)􈖡Dc #XE#\J0t_o^b2y"rٺ!N=dEZS?$h9hHc Zκ@e!SD_fqyeaߋePwף^_;!Kuє|ueH&,F1g\74[N/W"o+ڑJ'0z $j"0a4bXQ'uB)zq˖cPGB""URYHm ˘]J'QJ6Ƹh+9Z Ν{yEGY}TZ_yMqw̐V c7 O)*T1JWtc)]IH ͨFS<=h5~i1nu\y /ӓm5)ۍ Pc[z8oPmO_I<6Z).-'YEb wT-;e=8o1|k _5 nȘ7HX3;廓eqduB8!BtBg c̑Rأ[鋴ƣ&I͸'WBb*ȲrR`v)[xP+ . 
GO*VTZ@$zk *,R5FD@:CJڬtI5'xlXj@NJ;wuW%_U5Zo;N k.NLźS3Y}Y;P ab)^׮Eg2n;aMg5)mVm #$,_=^oo'd|t8v opJS7J2@6?gщAF$˞^怜[vn-怎X~\[ nSj3p:b I{$*@{5~rF5ig#{p>XM= 9{YtC)s{`ͦ.@.A#cN?!F;Z*ڇ?S7"oaY)p60I"dRBY+>, E*6ʖ]}ò%(|J9:-R@ƙM^Lα:rVAʘ٢0MnFȁX޸V7vB^f'ϴVRƻj Ӆ5f% @ݙJL&C[o3Q#cSPlMm{ڃb*۫0|l{ PJs "v$(h8m}&cQ+['8y"&Co/nsֶ믡oMA61=OAކjJ=vE06L4ȌGؽ696j,Q#pLJ (裫q7piT&=GUN̎G>j}0Dt3B4g[҉K|'&H?6O„&CճOJ MaMR=Z~yͲ:I]:Ą#]Vd]-373Sy]8V=tTݢzѸ!ߚ^yJ+}[T=g+}Vچo#TP ) /+> ku, y3diJ.J]rm= ν^jRz]\ P̝b:ܹKч9+hE+=ZKƱ]IKRp%[j?$'eÙp2sZ,Ȥ d|r(,2)˙![IXR5Uhr:ӢD}8P  /E#1gS.KU]{ L$"*c.hu8=baʍ?/]Ϳy巌l3KY3K{|'f֍¾O-%m^O(=ƕڭbDmw}grc&Q%)<~y9x9:/ X!W3%L%hF5]Vr1$7onuBSa(ՐDXY^^[?E?O/2ڙ9L+Y+= gIҋ kN|D%9 1**;JtC tPԉcZ ]-}F  *-3kH~PS͖N$#Lw~`5;f1(7}YkHE)sNס؄|qba'f'.UFH'os܍*6Y(i_w |V=#1Uu |ݾ V=+0iӉ\Kr'=,] T,0U $폠 _wB"%ďQa{w6g|oEK7^?rR7He"qH-G!BLq^f3^fAz코Sdۤm`ꅦBk_D1u7~Y5r@6d\GJm5Y w Ewq 6 4=)cJX>'QO+wۀ8ZB`A =X7U 8i{QW31&3J,ZlvϮ) ,*HOXXWiLD}/xъ#F _hё88~AW:ࡒH.H E Yql䔑 %(a@ ?C(1B|&DL>~1Fp;`jH N>L \(΍),\9BڥY]Kz\++ɴ/D]E"P!vx>D.3bܪ8exp=,2r#:@Jx)#];Uڶhz.H^=ģ{ Ten mL eV#imJQf;?BVωHX~B4&'i:wf';?u6A!lLc_ %t.ҩbEJԊu~4@L=)H +87C'q6C΁}i ^;r9:J9N18[̝.qT 5ݗݐ@` lL}o%WU!8\kn {!mb~=[j(ƠғL79o0L<8C#Ytyy]QPmF@ķa1A[ym#x{562Q;k$' {?ns41(ٟMA ԤTOx U/qo?yRHWpFC(stWok3ҡ?߽o^u@@wlf]~E:by64T9ޕf.=˧yc\r{O|W! ބeH@h/>;zG [_Nwtnӻ"MLNnCpO΢x6"\5醜sn}q:}EҊ@.p2ֿhtC~rmH)Q5O+_*REv.=OWOϵ~*:&BX]wџ䉿^yysKȈ(eR3bybNf!# J*ul^w=xs%ͮCևĮRfºlW.[o=o,7_S6Mz5v2Zd)GR8zLGvr4KX{>S-! 9% ^Dps2&DMe#PfCjgrz#ӧvQ_$W>Ǜ.k&GkLΜW63 T箐x( w[5wџ@6Ƒ%t~Hh:0SY8:ƨhJiWBS CaC.'NJT4+Pա }x{/]xǍ[_(q9 "q. 
&3,.mj-c;$wR7KUZt7l< $61tvi:b'Wn2BkphnR9iphRi'{j_+7ޒuUȺpL?MnքZJmU]=kJom~ִvɊp8~!NӴ<aeT N\>n~Zvĵd$-"}BjG1pM?m8g evfsyxQ\hŤ9v xF HhviOߛ75 c嬮Յ_FkzUG+b%eL(GYɣQV~>hW9kcIby' â$F%*_g~~ Xx"r[ պPj@OV,)}J]ҋ oQm~ uwiah3޸dU=ϏtSμq $nDIWhHR(`n 8EX69v4qXϣ3nh?}V䡂ŧj??}R*n+UOPXC|7:A+N"Tfg]a˵F~L0Jx&'zhPxx͟4RHr~+%h<WWr7X(&OWLKaͫՉ"]7\QQea&Wn |cgUWW| 8mo^R"k^{)@]tϧͅ9Kі96L6Aa)yּaCNda,2%RnDp_ š}{\cJ׌_yP6D?sh^1s%nRdO {/rx7iSlqZt1K G .R<S.Pj{OJ<|b}:,ROG}x鈇W?u"OD**ϵ=d[i))%&Zoav24WU߁j lWj>}򀶛{ J)fTéC'IFl)#{?yx&݄8j Vߠ :PS-udv]'yK;jvPm5`~Wr"frD]x&fݶ!GgԿ噡y0W 鏏uy'ힲQș*י佮zm/e&Yev˷'qPjM' h'җk۱[w`IwB&ɦg[WNwn"inݡօޭ 6Ņ .ljkzNڷM@_X-g97.sg#+6x~1x pP\))c/STݦJ!^N|է`X)3B S JFϼEhri[Έ_ w3|N8N(+:}OSvZ }>tl,wbvĊ$~ʇ:U'X{cF~n<ය(pvNJ0uq9GPP:0!)ogKuPW|dcL?c_a*m_ _mp$3{.+a!W'Zn7u>,~Dt^c] ٲbj,vnMT.3J>h@@*Uh%WJ=~N @wv9 '9Q##zAҲFKCX!l@QZ=Bf^H&&K`& F9':TP)'RVKa8hyF&6(Q(G1c9M9GfF=xp_S\ lLR$G#Cdwu}|YZb C$[v}} S(*3iYYۇ+Pc= URi>U)[ mA\s2~Mƭܻλ=ڊP:=٦ݘ,AOs&ipq#ȞR"kj~2p!Z}z(skFчKOfdV+*礓 VQNٷqz/@Z 50A{q6Ѡq*!%Yl֕N"rVPb_{C[IR Ѩ?ցBuב6npU%1F{N*@| T$b 7D@!#W5#kfˬJ'7?*8#4#Xf8|;l<ѹwtlwq~8aU~XԹ=7XL-+C8xs_L -yLà'fn?gPz>^un%)C%aceUrÖCwrqcr;E0t3D't#6 C։X!T2=l;vW( =g< q^'tDs u<2 q*5t6؛sI SW~  \"K\&5D8}$KIb֊&k\W .H׎:s):-dXI"O.JO%ׄ23 +9~+oSx).^LPժ C.i%)*b>Pa$ ro[=x+w]/:2wS/a-$r^HwX BJXr:l;vgUK>Ӹerbr(.bt,Лx!z􂮒#C,L7nWbWUm 6OwZcq°#=V|$(6\?ݕ1)SCk(I{)dB|/˲OnF1 *`fKL*F [[BD>)ǑPa ܌ K9K,͛"J U̍NXUl(zΡea;)bdrM8%* 7F"p$Ѫsz1*EZy.Am@ǭӦ[!kÞ5Ljq|y"q,bPԜ E,7L9B>~dWold\6[{m $:Z$&pIfX"DZhv6+oEh_² ?yCz;>x0UStkyL> u `+HZhe7DV-"@`Tک=2θzau LQ&^!sq5mKD|j[4I jxo e9s TXʴ)i όejՌOZE60{m,Z YLV)'QJG`R+Ka(̙y-Qk)RjzLU"pAoV/s B8)/Qp2[*<}xOxr3&zˡ%{fgL ,ęq4ĻԾ9e\t.b9gU3FL4+2$IUKBl}t c\WyikX6yHcU-ƀw.Ax%F`K˻}3$>tGÄySaz:JW&YP*FXV10r˩4yl6HܙLww;Ë7jh:3!cB$ R&bLڗLY'ZsSIIT)59-P7/El keJDLmD>ȜI:X=e`;m,oGXV㢮UA쯢YZ8b~*I)MS,$BJAFEqAaWJ+QoT x„O-h1hAx_bz )-^@:WQqݚ5I\OoW=yǏ _%kKgM C䉇k$$ i`v}P8dM4kn>ӊTDLj9e9D㥑J2JB1`4N8kfRb5) "i pq@K浶> Aw-مKat.?V1Ys0"rx `Kn%|W {08Mø{KW(:p9Q <^Ƒ  Mx~Y0s'xLPu!$#9 >ꕆ0T"K z#:b)1f*{mάX[b!4ps10x%v۩GUv{N;R+a ϊ#0e0$:&L_gUUF/OnΊ0zV(0I9+X3+$;pڟ =%hi9}3NH2a@ v`I>V3|m4wz* 
kNFJir>>7dG[@"= ax=\OĮLCtEe<3zt.-8{f0QG\1uYL ԕ$pΗnj}N:zOO+&ܸ%=9J@=Tr.+1െ 1 V> c,; fdXgϒ``|DI9S[+BIRK=;`$$x2Tvus`4>owpoсq;Kc}^Yj5 8bx/Lރ LJ~pl>ntUxm$LXF!9aה aPToL*ws n"R 8ͪLc.60f ` 1uLFXIf67FA@08iGjmsYj#u2&\ aS$ylz{%[n8T|*L+ ۖ}p1ZxOn)KC/U}k PHaVpA2u^`:?@ VOJ, Q-1xE}OӞ(YP+y}E*;܊̝^3*, <=ˢ'Y3$d2u,0Ofm$Z%@Of% %d SE0ƪ hx_T|!`D \ J1m,l!|xD{m()9augGXݤg!TW-}2}0n,Z7ϫgyQ /^:2P8h! y 5o͢<Ѝ@8t7yyR_T擧$qQOA#Q*< W\M9+_IYh ݚ^WzTɇLڶg$*#ӽ7/FSg=|=[M34| ΊCr!B?]JBQ4W_לpZZz..ŻQ݌l9$ރ~j^q?o&jԶqmɸydwy˷Io>"]#-fLKK%sQH*jE #.~1嶆_s>Ak7f>MY qC_On#jgyBaպxKӸw/c>Nls~+5 FZC뺜+)/֗^d{cp->=θO߾n) A} YBtke8EPwV>htp8as1~XXk_t{[K}}=&0탽Y[ w~ $ʘ+KvO~x:lZv%}S-RV(1 'i7{iik0Ʌf䈓 G %Vͭa,zEߦ /֒3nw=3Xl/ dɴqD%w$@b>߅&mN{Wfة3Hbg<揤(HOٟ͗ր&egPAiij:O#ˑ/)ЙO09֜IMjBԋ}#:]tվ(}FI/-´f.E^ȗN7ߗuKJ9hxk/ٌ[r8}>`oU>N |>K@66B8P.D,KB~(ҤJ6?_ 9:x$Khko"ivkVʻ<1_!b"*[L dK! 2Y.RԈ"kTcJD3Hݍk?e*M&S.jfrS [E\S u)dHy/29i5e>W~ #ô]Ø&3Mȱ6׷!P/V:EdPq݆<-TJHRVV=.( 1`f~֣BJf\2ښ7&9ꡕOZ!}if$1 n@t:?Ƃg>;9Pfa"hdPo ,!f 4jЀ0/X*dZ3#m5n'fѷGߊs{ǰ)9>LqsyUj3W:6288#yͫ:*|==r(s"0I+f;gUu$H01)dͱ)_#XFxyU04T][)Y~RS&5̿ƚ7lÿs#B[c;m(52uV^HaY"N8ͧD .I؝j;QbLDdfDa,ec`1`]*ދY40j/k PCJ&zQݲ`ȲsL yM"r*#SU4g*E2/ȂSЀOgǼ:K7ksป&{g #C\5Ҿ\|cC5x"2@ ԨVI{N&,· E[$ؒ+"<ͯT*9(U6:SH]ɀ){D_2jvft"@p>.jQNFqpH{*a­I펃15 Kp,g,q`vLzЇj)rט@$\.TVN^gN5 zr}\@;8:\T/Ň_Gq2nr?JA/'.voT`S7a!Ο>A9^_\|_^ntQ6:"lؚ~TIOJe.UL{k=˓l}>YK#4䍫h/jPN&?n==z ֗57XcY)J6qYd%r85h'=Vܕtey;Y8G'uly>\1[] xzY ^iXW=\[ cbNҴ͡ jub #={oU7EUXy4*?Lǘv.DIOj i1,`q ,e%\#QD1RM<-9aۭKo<{ꀧ3xp4.Ǫ=^1)p'A ~ | +|<5V58{.Hܜ J>Y`= ??]6BF$XCw |$hq†Ƹ;?tyÖCwQ)'hM9=ٕZG>$T nBЁ;R6թh|^E  QX!7 }&1X)uu%zZƴ]8]씴}79bNl֧ATu̍l9}-98dpAr ZEbzwWJj&27gmEtbe7~occ+kS;L^p52JD@}m6߷,'o "Rh4=ԟ屋<5Lt'ٜ:H&Z0U+g;P ]=t^u9{: dBS%q) c uF5jdAhv"I*Jeu 0piS΢q3 lV5"[=}{s^k?fCY쁯hI,E\"XH#-h  ڿkOf[iN &cjQN'$ٮsx ~HNSyHdN_)Kw׀ us/CӁyB\gDe*^V'{H4P=sɿ*SH rF 2gysovk\U"K+ 5.]")ŇNwt"Z{alû jRIzgDcI+3ֺKv*ƿ!hSL:PMkka<]d91, ]d`cIHG-TqŘ'H1ƿeO]*/Q^ˍ&m]V!&;WQ$&P;? 
*scC%!is+g JhA$Fۘj 3&Mjzډ\$#vbe˔4umv"BCq);NC]nht;OFO1Y6Z'ΟLR$h_mSM0-k-kH*H6K>mQC/ރHp_LVd^Zt~&0¿w|RUxq!?}uz6'7C~]xY^.W~9G3tke&l^>woݨ=puYX 6EpJ;&Eo&D,lҮJܡǹh2{("C!,DWDALWDۙY" t1DOsS*< $5f/Z#hZMESo 8٬ytw(^ [O};ux=m]PYI?z1-\_P$g6p,$لطL_Coˢ}P||!6mO2:6~(dp䇻iio Lo*heeR`lŊV8]UҳŊwH>1Ѥ;+[%Su~dR 4%F#4DyDi'~/!EOy* 0SUOPfg[h$X=r3 C.6Aܒ!lrJmhT3<88#9UnJF'rI[ɒ?[װt1"K,|9>$de N^NYP_N첡zQ1J RE; j,[V;3tQJo#FהFGjz Ӂ&z{y?IbmMiu`+du%9ZqUe⩉ B(;1M!V |̨$w}y{W|w#?nzʹm"~MBݦE' -&A)RCP#X_vb?cr[$LvwRڱlQ˹si~'B5Jj. O_0 ,ֈ1`}#/M"dacT)JM|( -{GB8)A0a$BٶR#Uhǩi `y-,S$$DY1 7+#cD Cjϧc0&ɪ8Js!G@:MRB *ǜ94%K>Nb"m*31T[DҪa J Yb6ܳ8qYCs~PLTcwکj0& ( !RYF,*P%i'+f3KRz8a"\67g!pe>ʾ!Zוd3ݣ~nV+oA'& Sҗ[_2Ħ/oNAh.-#gG#w>}vxr;BEy{,573tu.˃)?"ӯG'߃rGH>}㲇JPl'"؞CqڕBĬї@0pYn( 43,)10=L*/,x@•Pj]HP!4(DE b_%ֆHO3ahq+ˇLzGDf(?EǣY(@ Y//6:y~D1Z"ŷi|@&fdƥ\R 0Lў*,i+ % 1C"4"\`!F 7e-\L,h5nh1J' c`B0fI|\P.C ׹C%V{u'2Ydzf)uaqN}uޅ >mQ CۛRB㴔РNhsTqtR! C@#'&ʃ Txjv<_"c"C ݤxF!h3ē]xF 'oVN)_A!xظkz|6㭱|=;PFǖИ6J k)n]a-Eݼ k~sJC\&.,ks 5w.'$ z]R#E6sh"T)5Uv>gpGX*!;^a/(>C:/e1zl}o}Uݻ.' ݷs]LTw2E#C'໑#)Qzu9UוBpC/b:'NP=#;JC5a;Kb#7'7 )bZڂT,+d)M!+&b;ñ66 1OONr1`GQXu&g a=&ziv[|*'_=%@O<Velc~~dCJB> B> B> B>k P͡bQq#Yi\e`8%C*-ĈVJ/V}yK?"iH=nI"eIޗ]XbBʪؖ%5Iod,IKmuWƕxvwA^zvZai+M'BeZ[};cJk_3ZCi/t#Bg'rlJC? QrH9)؞MsÔla ncJAGS!+a -9\9@&6#դ)HgʁL|O>_<=8יx:gqx u ;8i}ˌvӶKn k[ NGVJbu"MU)O]҉E) ԶzoWx !fk1-,VK[RUIR7>TiU^3__ L$VY1j%Y'K*ۯ+޺]wTTJ3Gpλ!q[ We9OYt!`%Ih]3gGZ>fʶӪKYx|W!q*߿X89;q UtNdvz?7s4N9щqݚN}!n:ҹYZ#ṔwmWkDn(!Shw@J޾~@;&3N>Tj1] ӑbZ PL)JT~AJ5kb:LWo&2pDCMiKŊA \`m*H 6JgFeD^%H`:?1Zi& %9iO۽-P(n (lUP<2wuZN P\iKkf#{v"Va2%G7NV5wB ĺ6)RZ(- ֕rr̠!ZNmN궱0(1}l3+dR670mLc4F0f}2r֙Λ빑 6H ns=;Rr=/HԢ[s@üGXwFpVoU'^|b9oA䶞߱0REٷjʤآ2ͪZb;pEȭ  Vt/YJ doϿc*(LY@[ tV5wyߐ25=KV>%?ړuƝJ[U؎w|DQ,4B:救 UDZMXa+k<:_w^ !Qҩs킭P;_/-U\<$D:ʑQJttR@QD+H)*E*!]雚 Hª:ITnKb{"v65؂y.8.߭$ P[:-ۨbzb$4N7$>Iy%+9PnY'!$ y.D;0DS4=M}JڌqqבJrqlH t"ڨA\~I;? 
m QV\I -bD5_&uސ_\M zd:~S++L'x]U/ [BHVC̓v,hJ,$<ݻkEo!eY z$xI5:/v&2X&/2y_JXC-~ rzC_s SEcts`z"Bҧ9-YM8ύ#}d>?.n(4}IǸxyW͚_>%?){cRMW'}GY[lgU5orqR֓q͒)NFoj7w -).x~Wi~9ڭ y"#S{ʹQ EtrE3FwKn}H5J,#T5tMS3k r/pk[KP^ {D/ZgjV~ >~G,Юs X/UhMo(laI},x3t]rPo=X{/[b9|@ 9HlaK+]ul<[d1Ec2L˥ɞWx e[O8J~nOd ^0 zqEN3g<Ū+tmy%x`JqE; ;I*NyëQ Rh=f&ߠp^ĨINzi܍8ghRԸ}t-9hos_Xڝ!'cQ(eOmNȓ1.́YN<]\bx(xD+@ZTԥvU^ 2&vOM,y kP ZFkdβFF'J9R6ޞMoXޮ9_ۻK(5?@; SċLYG4t HYתڄM&юfA/v&Wbo@#>Mqo@Pj$Hc$="6F*=ҀV9eF`}=G@x4ݶy^*11Ĉ1hH'4r7`Ck2!m=n7X` @3k*1P8 BFo{8]ڹfڹv:X+4&Y1pkk5zs9$VR}O ';=A2̻ &],ղG`Ãut-$7ut_i'wt]<=8SG.՚:uv_v-}j=kI-t}u;'1KPNCTdǝ*ZE'tv# : V}:t!Fu:fP[)txEdޞh436Jtw=fO*gmQ v ?4H.MŽU#Ok9$6 CB޸fɔu'o6{si$:ݎrp--U !o\DdJǹi4ϥA侣v;fF 5KvCB޸TbI2n$o MbsEJ܆86)+hC < y#iT{MhhYVWFk# !kueE%BFڢ dXN֠$|SRjAFB紭. ?-&6grjWe\.upd 'nčЯF2O YsܸĘ%v1Ѷ!Oٺ`NRAV8=CB$| x9+rѴ(~\IL@G 4P"i!do. 32b1dEPxmlDE!>ɾZV_YYd3v~).n DU03ȔԪ~}5p.e(rnBPMnRH6e$ir*#Xep>A&7l'b @z>/sFsFh+ĦU.2-oM, kn&Q#~lBaIu D%w\z;uD"uHF:aUy)z)W![Pzje𪤠Rq׵s#ў0-iD_rK.XY&G60[b"B&ϟGq-;|v8Խ˜_y'*)*yM*" ѼU*Rʇr)ub°X[FI00-ƅBؚ".7@:3n/`i#%AZ䧖_aUc2V{MXUd /e 4gcDAY>7H'$v@5H[DImKkYBjz3;M4 (ܴ`EM~ lq/x!uEM=5VFQR'>bbdI=B͓3)Qqcpθ{CK_N},T+e>qp/l  uتQٹmR=&ֱ'-v%9>2 _}鄆ч,4e*Lޯr!1y+`w͛:8Bt]r&|H)RU(.\2z^A$|nG*сv~zorq5ZhN ke31]7D=7M;4Q4JRRC=PcB,iS]ЕĀK 35B|]CyvXIW"*QϨ,י"F2#+)nQ3j*0~j_\N+ayX a D!a;$eU`lv\%.9™=Gʰ vp%_r5ޓ~(^+>lFxy)Yό52tPzyy1cwWŧ 2vAZfT׫zIƀS B˹JS)_ہ@3>E8 z׶SC邭>Ta%A (Lg=x_P:㺮}!ϻcKB?y6eklUz 4&j3QsVf#$iyOޛ6A1bZ )*bܯ+̹vD,Q2j&В&0൱l/"cv:^Mtֈ;GJ-v PzڐbApEZh1e>Œ-swM"Z336…Ֆ \'+iuw${%.}bc֥c2ȎRZk &Jn}_r < [RDȎRp71Q &AסּKN%jdy})vÃ] ѼR;aI0w} %(Q$'fC\ |6e@y ݂:'[&YrUEtDN:'e;PbMGS-=ú(RZzUsxMJK)u- IL/T6uFRG2d$O9%s"+ma댊V"eh d,$dIB 0PX).@Լ0P.dPhe G2(8d]'6w2h4]#B{`Z~d˹ۏֶ?Hy1:cjG?G 2+&Q͞ymLgXE:(cVÛXloO.R0^} )9rʌ9.CbI#=/>$ }mvuRlD|j^CdN79FapK6(G-"<]׊XY-;HvJ6 \aNmlTREd;H|Fe,Z&LeЛN6lIӆମ5aSҊ{]:©/ܗ/͘qŤZ]?c2yutGQcok0H`5\u` ˞5td/~q8U1Z-{OQ$Q2|ă K!FPKTVnjIe?1o5Ю7N>\xQy>*I^.WFV[(?p"R-'-pa,WȒ@cJ6ԧ_ E <+[onWs_X'Vq5 !ƼX0(b%a#b ^8üAـUf[N>l網ضy;nig~Nfd| *1τS+ށ!i*\F-PpFl4@irrMP|r)rmCzCyzp2-`{^g-n~U +EQ@N\h( J*?Мdž\Bsp}=:Ta؊yŲ 9v{ $L! 
(gROoyKy;'p[t4ȥ{4|]c S8lHt[v(<ϫeu}M%\Hk]?>Ln@Aʿ';]]45ܽնyՃm񷩭Wq lިW?ww6u^J_r* ap}ӻ}mbN r^Y~y8Pػޱ0]_spgPqvy ej,eLa0MFJpaapUrx/L=eӕwY\3/N7Ys3 3bz9J(ڴTkٛډ臇\zB@ QqW243|>;~3f} Lt&hW9-atj#s73Mh:XI ',/Pك#-.,![ rw?B,~r]XEu75*7ӻw% WQC 0`n{l!D{ܝuG>`s 'p%P;̉iKxJ/3#C񀬓EbTsMij')|"I:/MA^,bWU;'4 i ㈝d!6!H 9_h"E nxgQBa'W'^K:l,ő\F! FFT_:#>_vE)'< 仚 |K7 ?Kv KO%ZfI@맺> BK `8Xr49#3&lžsZLeRaJ6hy*rL"Z+ Xs=C׊K2  3!d?Gw|~~VICs2sc L aͱa9."kFڍQ f؈|u^5OqfL 5ܳ: oq_OaN_NNIyk(1 K9Xq3I ?U~?5 $Ŵ>FzjoQBΊ'$Mhx9Krx xH*QQ@rꤐ (`4ں3J)f^&0jx^a =iyϧxD-8@6\>-B. Gt4=CJąOog~l-e+"că7HjQqNЫx~td+5  H-P j#$V)1ӊPr)s,޿F(R^ k6Ģ̸\0ҌLiaNn\{҈}L i6mh{S"R"6f7*P=0PgJi*t㠁%%ObJbS_x:nT2']-?9!b0] ˎ) KyPM iW VF*r|NF#T_3c0 (%Mxšr>3甆ʛg-k6tWDg.4gfע*6 Bi<5Cb BEo%T\*ß =eo%494ie5*v[!TUeeX7ˇ=*4*tVÀwĔ} Ⱥ+Rh6:IgNƁ%QV#i :QTn+vX3^hůC>h|rMocg)TV!GAIqbhwK9V7CO|[Jzv}'-i,;R BPXqD8IpIbhqBxu(*#`5B(C,'7ĵk@%v<ų7U6\FAK\<㗢]{$pdo!\2&@Ӛk(JHA'D =<&mD-jF^ת~{ՅhjKA7Kr¥WB`KGrlgjxbYI"epU*YOAqB4SEZ.w FRP3$+B bjPlF A(a₸u\ %?7^A3c@҉[LQ[XqI QD]-uF/{C?n|}a̢4lsI#ұ+|poemJB_n%Ŕ[ ŀ5X3 gdJ7% + FcxTgRx̴c*)E:Ro(apZ`f\b>l-Pm2S,Ce}n'=?^ Gb4LJ nCsi!4I )]F޾UP>fz{ZoM#8*. ~]i2'U,ZyY_N7w%.,3R.g0;ߨ5؃[!l领L:14؛za˕Z*ԏ:ź?` 9/ xӍliZHMnV6vת O.'D>mtzc<:_f:uuR'NWvNf5U5N6nr߿kЅC0 bg?||]e]ԢM=iy ri`}pHct)abz0wIQ qU-^rvvWCw?mk|t$ayB4tFblwI7`S?LB A5mMy,cSâ_ń'ӿ4ЗeKR}ի?<.&vzykɮ˷E w[ݤݤCL ]u3`)I%ȸWαiAAɤiZ!Ia9Rb=Brpf*A+#bF4D`F͡YCq$ (\ j ``U+eQL"4PjX\z]li\W,jeݍ^VҨ5ZMJZX3_7Ēsmjjegt] ؜C!~q`U+%5OE)?Xʛ: ;}KJ9BBZGQ%[w{ʒvVEuCIhdJnU^ Q/)r L[J: IROF`~^B#_74Jv5@x ;yer4#.PU B8"O0`UCu j9[-_c+x\^F$'+G;]y ~6ؼ#F_d/١,t<|}A0XsK]lW׻:хΖם%.Ve] 4s+ux/q{٥|Pa[8Q+FULy˜w*;ӑ0qp+cTFV,뵐C`}l&kT* ލ! AsQ/9 )\gxx#emB`H;,pqaMہmE xjD'fSHÖgV$<βo r9;w޽@eIEu?qG;My$]RaM9rT6^-gcv-A s|^kc\#Ђ=!0'ֲۻm#hѧti2i- R|:u;߻ $Ky5NlD.5e%Hjh\4•CoE}=]&`eo*6BARTheCqVªb3dZb')[qG|Q/Ioago5*xB#A,F/I0C ]]dԴW4J몚7hDů CtƨJW-yUhumu7^D|Xt9 Bcc̶nMUڻ rJKͭDArJ bJQU IKgApl87HD(f0G/EQ5Zjݸwi7rɈaAN ^ũejXűa%q&P`d(0d &vG`kVD?-yvzqύ$]u22E#ݪ T.<'n'?jC"uәHq\@PoeyqNk,A ZYfV%*[657~_1p2JY2 s_u OrgAn5œ5ڰs xNݓ۽Kew|ZIY|@u >|WXĚ/Jlc3EH/0=xzcJz8a$05b07٫Ye]ʔ({)KlYA ^Bk? 
'7)(y(ZˈPNT~k"h-y]q,uSQ+dIiMJkYCk"T NJΩ6˵)E%%NU;?@.O/`b}C$Q9t (ʇ^B1yXEv_AbJ{b%A-p{|KK]L`}NCdJT j5nUV4F Ru&m>b1ݕ7Ž4s{ggQՍb/p:f-'K'ޝagK=K[$QA Gwfe޻p`3MPx`WvuW/]qe+`WG@7FLi})(c cttAAcR~ưRg@yۙѩS:*xd/!j4P-AKZƕ~Md&'$=Bu+ۢ:t#j0Ҙɲ`>gY7c|v_Sn-~o.."u>_1I :~X[ݶ.]z< ZaZ!Z8w?l e*n˓HBr͐)Mñ([.16!u8-BvBBrݐȢgiF8x_/;'"K!/A/ `Kh_:&4aFP96_ !Z˄oeTL_73j|{c~Gn?&Z1>萵>Aʁ\ms(o1X}*Q 1H6T%pgTê&ZjMt="7.UCF$F%3.zz_'^.yb$?z*0ח UV/*4 !.t@+&FZUPy*rŰ451;BXOɬQ "ž[~=0kĄQDR~Hp1Rg{ g$A0^5T fJP܏U#%pD@2;*\5ㆀQtNZC*,#%'Q3V1BE J8gjxEp5j_8Ӻ\ %;T5c1=Uě R9B=UzUҒVV[ff,>+E;NJK$5%!J]f EDfpsFQ+YⴖhY܂vE-QT lʚYJPԑ(j?bLs6F^~%<6HLp_FJ*𥱾o&>ѵe\7(/je(\0[ih9(J[>n~]ҷ"~P,e3{⊿X͈pV?~}Km0]o痾)#L E)*bɏ ]ߟ?UO.ϾL˙)*vQyvz i\4>*P'K{ug#S b^wG!Kpwkfg(g~*حu+6(")@Ijp[Jj[t Sc],A[U98jxLJȀvܻPG:~ItۭK|%nІ y j(1lݏ qR* |)~ltSH3[ rpݞTSD$~ #&輼Ǥ_ x䦰Lt ?i=0 F7^}S˶OQ7믛*%٨%K8Bh' /Kc/nKbM, ׇ׌M*8;?wUR|%.I R`ދȞT]44n*"jT/Iiwu”@tjT`I9 TEP`_.n"5%20~ldC{QHv6 EkG[#񬫄緋6ļdv?;h}Dnf>4s^=vzBp0|zhoG"OUf@>Q1¬Ao'(=\ Zx)ã-J#YgWd uQN5 ׍Mߑ6gWg) h߇1tHp| "e0Q]T۹7Ͽ.j؊d (3Bڷx6K8S9M!8]?ٟjz5T +^isS-kUۤZ?} r1i ӽT;|'rvhLn# 'nĈN3Rtƌn]/"hL5_jqni7%9햋A>#EME(Ǜ;ߥ,R !!_ht&h*N';u~,,{xd3# ro|momvb| mGei? }9_%xO"#qڕ]xK矜O%W~/I7M~=Xܼ}2Cj0Me>//NdVh]qT]swU+uST  I<ۍ]u+;wBkDP"qB5TʲЄ8pHJn؟Ţrș1X %Ad)ɖVWV44\ԍZ[eI]r e2EY 8 Ahר/h 3u&oD9_(Ь;var/home/core/zuul-output/logs/kubelet.log0000644000000000000000005102707115151013457017702 0ustar rootrootMar 01 09:07:49 crc systemd[1]: Starting Kubernetes Kubelet... 
Mar 01 09:07:49 crc restorecon[4643]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c476,c820 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to
system_u:object_r:container_file_t:s0:c268,c620 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc 
restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 
09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc 
restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc 
restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 
crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc 
restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc 
restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc 
restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 01 09:07:50 crc restorecon[4643]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 01 09:07:51 crc kubenswrapper[4792]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 01 09:07:51 crc kubenswrapper[4792]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 01 09:07:51 crc kubenswrapper[4792]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 01 09:07:51 crc kubenswrapper[4792]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 01 09:07:51 crc kubenswrapper[4792]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 01 09:07:51 crc kubenswrapper[4792]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.171124 4792 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174120 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174140 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174145 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174149 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174154 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174158 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174162 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174167 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174171 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174176 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174181 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174187 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174191 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174196 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174201 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174207 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174216 4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174220 4792 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174224 4792 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174228 4792 feature_gate.go:330] unrecognized feature gate: Example
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174234 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174240 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174245 4792 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174250 4792 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174254 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174258 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174263 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174266 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174270 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174274 4792 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174279 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174283 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174289 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174293 4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174297 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174301 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174305 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174310 4792 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174314 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174318 4792 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174322 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174326 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174330 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174335 4792 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174340 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174345 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174349 4792 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174354 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174358 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174362 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174365 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174369 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174373 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174376 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174380 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174383 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174387 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174390 4792 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174394 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174397 4792 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174401 4792 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174405 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174408 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174411 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174415 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174418 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174421 4792 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174425 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174428 4792 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174432 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174435 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175208 4792 flags.go:64] FLAG: --address="0.0.0.0"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175222 4792 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175233 4792 flags.go:64] FLAG: --anonymous-auth="true"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175243 4792 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175249 4792 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175253 4792 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175259 4792 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175264 4792 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175268 4792 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175275 4792 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175280 4792 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175284 4792 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175289 4792 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175293 4792 flags.go:64] FLAG: --cgroup-root=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175297 4792 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175301 4792 flags.go:64] FLAG: --client-ca-file=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175305 4792 flags.go:64] FLAG: --cloud-config=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175309 4792 flags.go:64] FLAG: --cloud-provider=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175313 4792 flags.go:64] FLAG: --cluster-dns="[]"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175319 4792 flags.go:64] FLAG: --cluster-domain=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175323 4792 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175327 4792 flags.go:64] FLAG: --config-dir=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175331 4792 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175336 4792 flags.go:64] FLAG: --container-log-max-files="5"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175342 4792 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175364 4792 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175369 4792 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175375 4792 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175380 4792 flags.go:64] FLAG: --contention-profiling="false"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175384 4792 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175388 4792 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175392 4792 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175396 4792 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175402 4792 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175406 4792 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175412 4792 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175416 4792 flags.go:64] FLAG: --enable-load-reader="false"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175420 4792 flags.go:64] FLAG: --enable-server="true"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175425 4792 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175430 4792 flags.go:64] FLAG: --event-burst="100"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175435 4792 flags.go:64] FLAG: --event-qps="50"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175439 4792 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175443 4792 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175447 4792 flags.go:64] FLAG: --eviction-hard=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175453 4792 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175457 4792 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175461 4792 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175465 4792 flags.go:64] FLAG: --eviction-soft=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175469 4792 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175473 4792 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175478 4792 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175482 4792 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175486 4792 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175490 4792 flags.go:64] FLAG: --fail-swap-on="true"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175494 4792 flags.go:64] FLAG: --feature-gates=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175499 4792 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175503 4792 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175507 4792 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175512 4792 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175517 4792 flags.go:64] FLAG: --healthz-port="10248"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175521 4792 flags.go:64] FLAG: --help="false"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175525 4792 flags.go:64] FLAG: --hostname-override=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175529 4792 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175533 4792 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175538 4792 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175542 4792 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175546 4792 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175550 4792 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175554 4792 flags.go:64] FLAG: --image-service-endpoint=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175558 4792 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175563 4792 flags.go:64] FLAG: --kube-api-burst="100"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175568 4792 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175573 4792 flags.go:64] FLAG: --kube-api-qps="50"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175578 4792 flags.go:64] FLAG: --kube-reserved=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175585 4792 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175590 4792 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175595 4792 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175600 4792 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175605 4792 flags.go:64] FLAG: --lock-file=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175610 4792 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175615 4792 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175620 4792 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175628 4792 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175632 4792 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175637 4792 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175643 4792 flags.go:64] FLAG: --logging-format="text"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175648 4792 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175653 4792 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175659 4792 flags.go:64] FLAG: --manifest-url=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175664 4792 flags.go:64] FLAG: --manifest-url-header=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175671 4792 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175676 4792 flags.go:64] FLAG: --max-open-files="1000000"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175681 4792 flags.go:64] FLAG: --max-pods="110"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175685 4792 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175689 4792 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175693 4792 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175697 4792 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175703 4792 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175707 4792 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175712 4792 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175723 4792 flags.go:64] FLAG: --node-status-max-images="50"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175728 4792 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 01 09:07:51 
crc kubenswrapper[4792]: I0301 09:07:51.175732 4792 flags.go:64] FLAG: --oom-score-adj="-999" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175736 4792 flags.go:64] FLAG: --pod-cidr="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175740 4792 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175747 4792 flags.go:64] FLAG: --pod-manifest-path="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175752 4792 flags.go:64] FLAG: --pod-max-pids="-1" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175756 4792 flags.go:64] FLAG: --pods-per-core="0" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175760 4792 flags.go:64] FLAG: --port="10250" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175764 4792 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175769 4792 flags.go:64] FLAG: --provider-id="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175773 4792 flags.go:64] FLAG: --qos-reserved="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175778 4792 flags.go:64] FLAG: --read-only-port="10255" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175782 4792 flags.go:64] FLAG: --register-node="true" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175786 4792 flags.go:64] FLAG: --register-schedulable="true" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175790 4792 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175798 4792 flags.go:64] FLAG: --registry-burst="10" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175802 4792 flags.go:64] FLAG: --registry-qps="5" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175806 4792 flags.go:64] FLAG: 
--reserved-cpus="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175810 4792 flags.go:64] FLAG: --reserved-memory="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175815 4792 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175819 4792 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175823 4792 flags.go:64] FLAG: --rotate-certificates="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175827 4792 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175831 4792 flags.go:64] FLAG: --runonce="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175835 4792 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175839 4792 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175844 4792 flags.go:64] FLAG: --seccomp-default="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175848 4792 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175853 4792 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175858 4792 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175863 4792 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175869 4792 flags.go:64] FLAG: --storage-driver-password="root" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175874 4792 flags.go:64] FLAG: --storage-driver-secure="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175894 4792 flags.go:64] FLAG: --storage-driver-table="stats" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175899 4792 
flags.go:64] FLAG: --storage-driver-user="root" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175918 4792 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175924 4792 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175929 4792 flags.go:64] FLAG: --system-cgroups="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175933 4792 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175940 4792 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175944 4792 flags.go:64] FLAG: --tls-cert-file="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175948 4792 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175954 4792 flags.go:64] FLAG: --tls-min-version="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175958 4792 flags.go:64] FLAG: --tls-private-key-file="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175962 4792 flags.go:64] FLAG: --topology-manager-policy="none" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175967 4792 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175971 4792 flags.go:64] FLAG: --topology-manager-scope="container" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175976 4792 flags.go:64] FLAG: --v="2" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175982 4792 flags.go:64] FLAG: --version="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175987 4792 flags.go:64] FLAG: --vmodule="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175993 4792 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175997 4792 
flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176104 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176110 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176114 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176118 4792 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176122 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176126 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176131 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176135 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176138 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176142 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176147 4792 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176152 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176156 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176159 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176163 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176167 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176172 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176176 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176181 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176185 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176188 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176192 4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176197 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176201 4792 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176206 4792 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176210 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176214 4792 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176219 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176223 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176228 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176232 4792 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176236 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176240 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176244 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176248 4792 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176252 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176255 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176259 4792 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176263 
4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176267 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176270 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176274 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176277 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176281 4792 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176284 4792 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176288 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176292 4792 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176298 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176302 4792 feature_gate.go:330] unrecognized feature gate: Example Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176306 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176310 4792 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176313 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176317 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176321 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176328 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176332 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176337 4792 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176340 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176345 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176349 4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176352 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176356 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176359 4792 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176363 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176366 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176370 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176374 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176378 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176381 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176388 4792 
feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176392 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.176398 4792 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.187096 4792 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.187636 4792 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187788 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187803 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187811 4792 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187819 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187828 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187836 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187842 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 01 09:07:51 crc 
kubenswrapper[4792]: W0301 09:07:51.187849 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187856 4792 feature_gate.go:330] unrecognized feature gate: Example Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187863 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187870 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187878 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187884 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187892 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187899 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187928 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187937 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187944 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187950 4792 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187957 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187967 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187974 4792 
feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187981 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187988 4792 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187995 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188003 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188010 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188017 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188025 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188031 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188039 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188047 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188054 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188062 4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188068 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188078 4792 feature_gate.go:353] Setting GA feature gate 
ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188089 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188097 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188106 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188113 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188121 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188128 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188136 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188143 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188151 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188161 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188169 4792 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188176 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188185 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188192 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188198 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188205 4792 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188212 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188221 4792 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188229 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188236 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188246 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188253 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188259 4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188266 4792 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188274 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188281 4792 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188288 4792 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188295 4792 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188302 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188310 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188316 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188323 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188330 4792 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188336 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188345 4792 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.188357 4792 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188570 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188585 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188593 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188602 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188611 4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188619 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188626 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188633 4792 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188640 4792 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188647 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188654 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188661 4792 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188668 4792 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188675 4792 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188681 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188689 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188696 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188703 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188709 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188716 4792 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188725 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188732 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188738 4792 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188745 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188752 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188760 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188768 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188777 4792 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188785 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188792 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188800 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188807 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188813 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188820 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188827 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188834 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188841 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188848 4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188854 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188861 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188868 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188875 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188882 4792 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188889 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188898 4792 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188931 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188941 4792 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188950 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188957 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188963 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188970 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188977 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188985 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188991 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188998 4792 feature_gate.go:330] unrecognized feature gate: Example
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189005 4792 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189014 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189021 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189028 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189034 4792 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189041 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189050 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189059 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189066 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189073 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189081 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189087 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189095 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189104 4792 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189112 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189119 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.189132 4792 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.190282 4792 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.196114 4792 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.202605 4792 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.202824 4792 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.204800 4792 server.go:997] "Starting client certificate rotation"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.204854 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.204997 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.234859 4792 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.237303 4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.239748 4792 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.256578 4792 log.go:25] "Validated CRI v1 runtime API"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.292888 4792 log.go:25] "Validated CRI v1 image API"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.295814 4792 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.301485 4792 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-01-09-01-56-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.301524 4792 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.313382 4792 manager.go:217] Machine: {Timestamp:2026-03-01 09:07:51.312058072 +0000 UTC m=+0.553937269 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2800000 MemoryCapacity:25199472640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7013d830-7d29-4a03-853d-b832509642d4 BootID:10ee72b7-c3f1-449a-bf55-34c8d2b9c7af Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599738368 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076107 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599734272 Type:vfs Inodes:3076107 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039894528 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:05:dc:1a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:05:dc:1a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:84:8f:7c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a1:4b:44 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:71:73:f9 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:68:72:88 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:e8:05:b1 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ee:ea:05:c0:de:9e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:66:81:6d:65:90:48 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199472640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.313568 4792 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.313784 4792 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.314075 4792 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.314268 4792 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.314306 4792 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.315289 4792 topology_manager.go:138] "Creating topology manager with none policy"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.315310 4792 container_manager_linux.go:303] "Creating device plugin manager"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.315834 4792 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.315859 4792 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.316727 4792 state_mem.go:36] "Initialized new in-memory state store"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.317133 4792 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.320856 4792 kubelet.go:418] "Attempting to sync node with API server"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.320894 4792 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.320939 4792 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.320954 4792 kubelet.go:324] "Adding apiserver pod source"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.320968 4792 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.325743 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.325860 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.326469 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.326621 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.328049 4792 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.330072 4792 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.332673 4792 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.339880 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340184 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340199 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340208 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340223 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340237 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340248 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340264 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340277 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340288 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340303 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340312 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.341600 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.343647 4792 server.go:1280] "Started kubelet"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.343975 4792 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.344694 4792 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.345527 4792 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.346212 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Mar 01 09:07:51 crc systemd[1]: Started Kubernetes Kubelet.
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.348983 4792 server.go:460] "Adding debug handlers to kubelet server"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.350182 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.350219 4792 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.350827 4792 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.351269 4792 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.351450 4792 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.351603 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.351664 4792 factory.go:55] Registering systemd factory
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.351721 4792 factory.go:221] Registration of the systemd container factory successfully
Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.351661 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.351720 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.351920 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="200ms"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.353888 4792 factory.go:153] Registering CRI-O factory
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.353945 4792 factory.go:221] Registration of the crio container factory successfully
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.354052 4792 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.354085 4792 factory.go:103] Registering Raw factory
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.354102 4792 manager.go:1196] Started watching for new ooms in manager
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.354920 4792 manager.go:319] Starting recovery of all containers
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364174 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364239 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364253 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364264 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364289 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364300 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364314 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364422 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364439 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364453 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364462 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364479 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364490 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364568 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364579 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.368785 4792 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.368873 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.368896 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual
state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.370943 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.370986 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371001 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371019 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.361882 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1898ac74dff1661c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,LastTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371033 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371101 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371118 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371149 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371164 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 
09:07:51.371184 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371198 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371214 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371225 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371305 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371317 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371344 4792 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371365 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371375 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371407 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371417 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371445 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371459 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371470 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371490 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371507 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371522 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371534 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371543 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371553 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371595 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371627 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371673 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371686 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371699 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" 
seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371716 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371743 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371791 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371882 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371946 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371963 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371980 4792 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371993 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.372034 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373355 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373420 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373434 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373446 4792 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373458 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373469 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373479 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373492 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373508 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373522 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373535 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373548 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373562 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373575 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373596 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373656 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373672 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373686 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373706 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373720 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373737 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373750 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373766 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373781 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373793 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373805 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373818 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373832 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373845 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373867 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373888 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373903 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373938 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373955 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373975 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373989 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374001 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374017 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374030 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374062 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374075 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374089 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374107 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374124 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374152 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374169 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374208 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374223 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374239 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374258 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374271 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374287 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374301 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374318 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374342 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374360 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374376 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374390 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374405 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374418 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374431 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374445 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374484 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374506 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374524 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374548 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374563 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374576 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374590 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374605 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" 
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374618 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374633 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374645 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374680 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374696 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374710 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374755 4792 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374772 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374785 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374800 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374813 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374829 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374846 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374859 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374875 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374889 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374932 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374947 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374961 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374977 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374992 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375006 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375021 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375035 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375048 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" 
seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375065 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375079 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375101 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375120 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375143 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375160 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 
09:07:51.375198 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375214 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375228 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375250 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375268 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375283 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375304 4792 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375320 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375334 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375347 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375362 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375376 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375389 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375402 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375426 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375445 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375465 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375487 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375501 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375535 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375547 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375561 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375577 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375591 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375605 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375618 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375633 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375648 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375661 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375675 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375687 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375702 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375720 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375734 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375748 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375762 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375776 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" 
seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375799 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375815 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375832 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375852 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375866 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375885 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375898 4792 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375926 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375958 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375970 4792 reconstruct.go:97] "Volume reconstruction finished"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375980 4792 reconciler.go:26] "Reconciler: start to sync state"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.380861 4792 manager.go:324] Recovery completed
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.391733 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.393300 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.393349 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.393384 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.394139 4792 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.394159 4792 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.394178 4792 state_mem.go:36] "Initialized new in-memory state store"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.405593 4792 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.407449 4792 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.407495 4792 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.407525 4792 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.407570 4792 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.408665 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.408741 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.428007 4792 policy_none.go:49] "None policy: Start"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.428968 4792 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.429041 4792 state_mem.go:35] "Initializing new in-memory state store"
Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.452116 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.483023 4792 manager.go:334] "Starting Device Plugin manager"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.483109 4792 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.483126 4792 server.go:79] "Starting device plugin registration server"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.483675 4792 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.483693 4792 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.483890 4792 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.484015 4792 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.484022 4792 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.495927 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.507747 4792 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"]
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.507883 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.509011 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.509061 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.509072 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.509260 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.509469 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.509570 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.510507 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.510556 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.510568 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.510742 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.510891 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.510983 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.511061 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.511110 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.511133 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.512197 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.512301 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.512313 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.512363 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.512390 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.512421 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.512723 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.513206 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.513344 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.514204 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.514232 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.514249 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.514543 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.514662 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.514700 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.515388 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.515422 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.515434 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.518576 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.518599 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.518609 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.518749 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.518773 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.518783 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.518999 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.519033 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.520487 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.520513 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.520526 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.552901 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="400ms"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.578727 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.578772 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.578798 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.578815 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.578836 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.578854 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.578960 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579028 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579068 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579125 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579159 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579193 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579228 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579315 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.583802 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.585241 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.585271 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.585283 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.585307 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.585718 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680656 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680729 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680762 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680795 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680818 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680827 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680866 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681051 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681086 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681001 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680949 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681056 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680982 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681233 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681278 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681280 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681302 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681328 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680899 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681344 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681387 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681399 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681437 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681437 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681462 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681477 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681500 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681565 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681695 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.786662 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.788387 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.788438 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.788457 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.788491 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.789076 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.863073 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.873733 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.894063 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.922603 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.927437 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-54de6d95ee1a53bd81473feb5d58e04aa62779a5735069ffba57159a05e89998 WatchSource:0}: Error finding container 54de6d95ee1a53bd81473feb5d58e04aa62779a5735069ffba57159a05e89998: Status 404 returned error can't find the container with id 54de6d95ee1a53bd81473feb5d58e04aa62779a5735069ffba57159a05e89998
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.931899 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-7f6e6a0bb71eeadc54be97424d9bc94a4f1fde2ff23b9ce9c18316fe1533597e WatchSource:0}: Error finding container 7f6e6a0bb71eeadc54be97424d9bc94a4f1fde2ff23b9ce9c18316fe1533597e: Status 404 returned error can't find the container with id 7f6e6a0bb71eeadc54be97424d9bc94a4f1fde2ff23b9ce9c18316fe1533597e
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.932532 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.937514 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-25ab059ec0587552073d7fa61eb3c0bd4797b13cbcca56593dab2ff0fc7bb158 WatchSource:0}: Error finding container 25ab059ec0587552073d7fa61eb3c0bd4797b13cbcca56593dab2ff0fc7bb158: Status 404 returned error can't find the container with id 25ab059ec0587552073d7fa61eb3c0bd4797b13cbcca56593dab2ff0fc7bb158
Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.953755 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="800ms"
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.958024 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-a6dd73c06f97cd741b2cff04b9b032c55e67d970b5d44ad6a4fa216321f3aff7 WatchSource:0}: Error finding container a6dd73c06f97cd741b2cff04b9b032c55e67d970b5d44ad6a4fa216321f3aff7: Status 404 returned error can't find the container with id a6dd73c06f97cd741b2cff04b9b032c55e67d970b5d44ad6a4fa216321f3aff7
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.966077 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-9995264936e1cf89cc354c4ddff95af9290d403e2ce9aca130316680133c4376 WatchSource:0}: Error finding container 9995264936e1cf89cc354c4ddff95af9290d403e2ce9aca130316680133c4376: Status 404 returned error can't find the container with id 9995264936e1cf89cc354c4ddff95af9290d403e2ce9aca130316680133c4376
Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.195136 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.202139 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.202194 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.202208 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.202242 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 01 09:07:52 crc kubenswrapper[4792]: E0301 09:07:52.202791 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc"
Mar 01 09:07:52 crc kubenswrapper[4792]: W0301 09:07:52.342324 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Mar 01 09:07:52 crc kubenswrapper[4792]: E0301 09:07:52.342421 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.347428 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.411890 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7f6e6a0bb71eeadc54be97424d9bc94a4f1fde2ff23b9ce9c18316fe1533597e"}
Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.412849 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"54de6d95ee1a53bd81473feb5d58e04aa62779a5735069ffba57159a05e89998"}
Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.413764 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9995264936e1cf89cc354c4ddff95af9290d403e2ce9aca130316680133c4376"}
Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.416831 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a6dd73c06f97cd741b2cff04b9b032c55e67d970b5d44ad6a4fa216321f3aff7"}
Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.417967 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"25ab059ec0587552073d7fa61eb3c0bd4797b13cbcca56593dab2ff0fc7bb158"}
Mar 01 09:07:52 crc kubenswrapper[4792]: W0301 09:07:52.418796 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:52 crc kubenswrapper[4792]: E0301 09:07:52.418874 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:52 crc kubenswrapper[4792]: W0301 09:07:52.688597 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:52 crc kubenswrapper[4792]: E0301 09:07:52.688677 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:52 crc kubenswrapper[4792]: W0301 09:07:52.734812 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:52 crc kubenswrapper[4792]: E0301 09:07:52.734934 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" 
logger="UnhandledError" Mar 01 09:07:52 crc kubenswrapper[4792]: E0301 09:07:52.755217 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="1.6s" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.003961 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.006231 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.006299 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.006318 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.006363 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:07:53 crc kubenswrapper[4792]: E0301 09:07:53.007038 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.336943 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 01 09:07:53 crc kubenswrapper[4792]: E0301 09:07:53.337771 4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 
38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.347246 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.438156 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.438243 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.438274 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.438276 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"684aa0b5aeaecf3fb572ab400e718956a5373c4fa6651e742d65bdbc3425b9b2"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.438536 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.439849 4792 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.439881 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.439894 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.441389 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada" exitCode=0 Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.441525 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.441552 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.443795 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.443822 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.443832 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.444182 4792 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="89989fe138b858dc3592cf753044ffb1142921d214e92ea0cf407ba0a44790c0" exitCode=0 Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.444603 4792 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.444601 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"89989fe138b858dc3592cf753044ffb1142921d214e92ea0cf407ba0a44790c0"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.452833 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.452873 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.452891 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.453332 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.454982 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.455020 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.455037 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.455785 4792 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="6faf5a3e146bbcf0a0ebd280fd8a537de025e388c0b5abd07452fed024a59d58" exitCode=0 Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.456025 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"6faf5a3e146bbcf0a0ebd280fd8a537de025e388c0b5abd07452fed024a59d58"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.456073 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.457934 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.457974 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.457989 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.458418 4792 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f" exitCode=0 Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.458453 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.458558 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.459851 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.460097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.460230 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:53 crc kubenswrapper[4792]: E0301 09:07:53.468737 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1898ac74dff1661c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,LastTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:07:54 crc kubenswrapper[4792]: W0301 09:07:54.158234 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:54 crc kubenswrapper[4792]: E0301 09:07:54.158310 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.347283 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 
09:07:54 crc kubenswrapper[4792]: E0301 09:07:54.356473 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="3.2s" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.464097 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"60fcc4fed210cb9af0325e0689663cbd1c6099e535847c37fe925b2262c1ba2d"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.464153 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.464956 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.464988 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.465005 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.470115 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.470375 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69"} Mar 01 
09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.470398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.470445 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.471830 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.471870 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.471880 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.475255 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.475291 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.475303 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 
09:07:54.475314 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.477407 4792 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fa59a983918262456d3ae53d3e2465ee0d2d5d20f175a152775ccf8cf961d4ac" exitCode=0 Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.477489 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.477504 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fa59a983918262456d3ae53d3e2465ee0d2d5d20f175a152775ccf8cf961d4ac"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.477577 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.478872 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.478893 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.478915 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.478940 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.478958 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.478968 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.607332 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.608557 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.608598 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.608609 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.608636 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:07:54 crc kubenswrapper[4792]: E0301 09:07:54.609178 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc" Mar 01 09:07:54 crc kubenswrapper[4792]: W0301 09:07:54.646990 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:54 crc kubenswrapper[4792]: E0301 09:07:54.647069 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: 
connection refused" logger="UnhandledError" Mar 01 09:07:54 crc kubenswrapper[4792]: W0301 09:07:54.880208 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:54 crc kubenswrapper[4792]: E0301 09:07:54.880277 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:54 crc kubenswrapper[4792]: W0301 09:07:54.913484 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:54 crc kubenswrapper[4792]: E0301 09:07:54.913578 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.482648 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"27e10bb6a0f100f119755de151dfea900c1268526a3fe0ee0f4f067654d2c3fb"} Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.482750 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:55 crc 
kubenswrapper[4792]: I0301 09:07:55.483793 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.483829 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.483843 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.484259 4792 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dbb402d58001b334d03322f8dbd5127ef54e6f9f4c68a50f96678cfb0d18b0c1" exitCode=0 Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.484333 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dbb402d58001b334d03322f8dbd5127ef54e6f9f4c68a50f96678cfb0d18b0c1"} Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.484385 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.484407 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.484431 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.484433 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.485308 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.485337 4792 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.485354 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.485335 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.485399 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.485413 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.486101 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.486143 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.486163 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.498783 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.498978 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.499949 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.499979 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.499991 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.489622 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2fc17e8ed51e49812ecf9e98e9643efca15ff1ae0c651ce9ce595cab4b867835"} Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.489706 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"261e0c49c66e0e2c75ff92754f75e47f491bd9bb111c21090588dc541a5bea7e"} Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.489780 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cde437ad36fd39f054e2cd44aaa14c04910c7b50c7fa9bd449fbc9473a90556e"} Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.489708 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.489798 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.490349 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.490934 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.490959 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.490968 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.490968 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.491058 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.491069 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.498840 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"85b1d3f760090b536cbc905310656d527e974d4ebd5fc8624a6fce696ac2f62a"} Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.498937 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"93314bca863dfc57ee7c0badd8485c0d785e94cac5c02f9299588aa1704961e4"} Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.498947 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.499034 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.500228 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.500263 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.500274 4792 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.500414 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.500449 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.500469 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.722953 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.809985 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.811610 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.811654 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.811667 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.811696 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:07:58 crc kubenswrapper[4792]: I0301 09:07:58.408581 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 01 09:07:58 crc kubenswrapper[4792]: I0301 09:07:58.499553 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:07:58 crc kubenswrapper[4792]: I0301 09:07:58.499667 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:07:58 crc kubenswrapper[4792]: I0301 09:07:58.501959 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:58 crc kubenswrapper[4792]: I0301 09:07:58.503071 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:58 crc kubenswrapper[4792]: I0301 09:07:58.503109 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:58 crc kubenswrapper[4792]: I0301 09:07:58.503121 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.221603 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.221894 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.223432 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.223624 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.223831 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.348191 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.348490 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.350162 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.350569 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.350759 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.504145 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.505537 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.505582 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.505599 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.115104 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.115620 4792 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.120406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.120481 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.120511 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.123058 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.235981 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.236544 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.238041 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.238238 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.238431 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.507742 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.509446 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:00 crc 
kubenswrapper[4792]: I0301 09:08:00.509494 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.509512 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:01 crc kubenswrapper[4792]: I0301 09:08:01.005860 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:01 crc kubenswrapper[4792]: E0301 09:08:01.496260 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:08:01 crc kubenswrapper[4792]: I0301 09:08:01.509835 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:01 crc kubenswrapper[4792]: I0301 09:08:01.511811 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:01 crc kubenswrapper[4792]: I0301 09:08:01.511846 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:01 crc kubenswrapper[4792]: I0301 09:08:01.511859 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:03 crc kubenswrapper[4792]: I0301 09:08:03.906468 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 01 09:08:03 crc kubenswrapper[4792]: I0301 09:08:03.906754 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:03 crc kubenswrapper[4792]: I0301 09:08:03.908159 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:03 crc kubenswrapper[4792]: I0301 09:08:03.908192 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:03 crc kubenswrapper[4792]: I0301 09:08:03.908205 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.348027 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.524663 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.527217 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="27e10bb6a0f100f119755de151dfea900c1268526a3fe0ee0f4f067654d2c3fb" exitCode=255 Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.527257 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"27e10bb6a0f100f119755de151dfea900c1268526a3fe0ee0f4f067654d2c3fb"} Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.527396 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.528378 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.528469 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.528497 4792 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.529967 4792 scope.go:117] "RemoveContainer" containerID="27e10bb6a0f100f119755de151dfea900c1268526a3fe0ee0f4f067654d2c3fb" Mar 01 09:08:05 crc kubenswrapper[4792]: W0301 09:08:05.846969 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.847100 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.849197 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" node="crc" Mar 01 09:08:05 crc kubenswrapper[4792]: W0301 09:08:05.849885 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.849974 4792 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:05 crc kubenswrapper[4792]: W0301 09:08:05.852175 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.852298 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.854817 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.856212 4792 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure 
output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.856293 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.856626 4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:05 crc kubenswrapper[4792]: W0301 09:08:05.859284 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.859352 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.865331 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1898ac74dff1661c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,LastTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.866494 4792 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.866601 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 01 09:08:06 crc kubenswrapper[4792]: I0301 09:08:06.349832 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:06Z is after 2026-02-23T05:33:13Z Mar 01 09:08:06 crc kubenswrapper[4792]: I0301 09:08:06.533784 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 01 09:08:06 crc kubenswrapper[4792]: I0301 09:08:06.536235 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd"} Mar 01 09:08:06 crc kubenswrapper[4792]: I0301 09:08:06.536537 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:06 crc kubenswrapper[4792]: I0301 09:08:06.537892 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:06 crc kubenswrapper[4792]: I0301 09:08:06.537977 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:06 crc kubenswrapper[4792]: I0301 09:08:06.538000 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.350149 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:07Z is after 2026-02-23T05:33:13Z Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.543218 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.544149 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.547982 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd" exitCode=255 Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.548081 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd"} Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.548194 4792 scope.go:117] "RemoveContainer" containerID="27e10bb6a0f100f119755de151dfea900c1268526a3fe0ee0f4f067654d2c3fb" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.548447 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.550618 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.550692 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.550721 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.553366 4792 scope.go:117] "RemoveContainer" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd" Mar 01 09:08:07 
crc kubenswrapper[4792]: E0301 09:08:07.553892 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:08 crc kubenswrapper[4792]: I0301 09:08:08.350274 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:08Z is after 2026-02-23T05:33:13Z Mar 01 09:08:08 crc kubenswrapper[4792]: I0301 09:08:08.500143 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:08:08 crc kubenswrapper[4792]: I0301 09:08:08.500313 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 01 09:08:08 crc kubenswrapper[4792]: I0301 09:08:08.553121 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 01 09:08:09 crc kubenswrapper[4792]: 
I0301 09:08:09.350371 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:09Z is after 2026-02-23T05:33:13Z Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.244341 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.244568 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.245973 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.246012 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.246025 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.246664 4792 scope.go:117] "RemoveContainer" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd" Mar 01 09:08:10 crc kubenswrapper[4792]: E0301 09:08:10.246940 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.249741 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.350804 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:10Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.561000 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.561793 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.561824 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.561836 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.562343 4792 scope.go:117] "RemoveContainer" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd"
Mar 01 09:08:10 crc kubenswrapper[4792]: E0301 09:08:10.562525 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.012657 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.012836 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.014066 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.014164 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.014187 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.018105 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.350079 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:11Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:11 crc kubenswrapper[4792]: E0301 09:08:11.496835 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.563821 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.565179 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.565263 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.565315 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.566305 4792 scope.go:117] "RemoveContainer" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd"
Mar 01 09:08:11 crc kubenswrapper[4792]: E0301 09:08:11.566642 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.202572 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.251122 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.253016 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.253080 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.253097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.253138 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 01 09:08:12 crc kubenswrapper[4792]: E0301 09:08:12.257493 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:12Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 01 09:08:12 crc kubenswrapper[4792]: E0301 09:08:12.259598 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:12Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.349556 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:12Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.566387 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.567220 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.567255 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.567267 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.568077 4792 scope.go:117] "RemoveContainer" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd"
Mar 01 09:08:12 crc kubenswrapper[4792]: E0301 09:08:12.568744 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 01 09:08:13 crc kubenswrapper[4792]: I0301 09:08:13.350137 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:13Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:13 crc kubenswrapper[4792]: I0301 09:08:13.931659 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 01 09:08:13 crc kubenswrapper[4792]: I0301 09:08:13.931971 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:13 crc kubenswrapper[4792]: I0301 09:08:13.933688 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:13 crc kubenswrapper[4792]: I0301 09:08:13.933760 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:13 crc kubenswrapper[4792]: I0301 09:08:13.933784 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:13 crc kubenswrapper[4792]: I0301 09:08:13.943854 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 01 09:08:14 crc kubenswrapper[4792]: I0301 09:08:14.080715 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 01 09:08:14 crc kubenswrapper[4792]: E0301 09:08:14.086011 4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 01 09:08:14 crc kubenswrapper[4792]: W0301 09:08:14.088736 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:14Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:14 crc kubenswrapper[4792]: E0301 09:08:14.088817 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 01 09:08:14 crc kubenswrapper[4792]: I0301 09:08:14.349634 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:14Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:14 crc kubenswrapper[4792]: I0301 09:08:14.572023 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:14 crc kubenswrapper[4792]: I0301 09:08:14.573465 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:14 crc kubenswrapper[4792]: I0301 09:08:14.573532 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:14 crc kubenswrapper[4792]: I0301 09:08:14.573553 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:15 crc kubenswrapper[4792]: I0301 09:08:15.352143 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:15Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:15 crc kubenswrapper[4792]: W0301 09:08:15.357127 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:15Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:15 crc kubenswrapper[4792]: E0301 09:08:15.357243 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 01 09:08:15 crc kubenswrapper[4792]: E0301 09:08:15.871691 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:15Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1898ac74dff1661c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,LastTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 01 09:08:16 crc kubenswrapper[4792]: I0301 09:08:16.352427 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:16Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:16 crc kubenswrapper[4792]: W0301 09:08:16.740083 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:16Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:16 crc kubenswrapper[4792]: E0301 09:08:16.740209 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 01 09:08:17 crc kubenswrapper[4792]: I0301 09:08:17.352081 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:17Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.351307 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:18Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.499940 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.500032 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.500128 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.500339 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.501892 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.501979 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.501992 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.502642 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"684aa0b5aeaecf3fb572ab400e718956a5373c4fa6651e742d65bdbc3425b9b2"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.502821 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://684aa0b5aeaecf3fb572ab400e718956a5373c4fa6651e742d65bdbc3425b9b2" gracePeriod=30
Mar 01 09:08:18 crc kubenswrapper[4792]: W0301 09:08:18.544162 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:18Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:18 crc kubenswrapper[4792]: E0301 09:08:18.544253 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.258080 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.259901 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.259958 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.259968 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.259993 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 01 09:08:19 crc kubenswrapper[4792]: E0301 09:08:19.264944 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:19Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 01 09:08:19 crc kubenswrapper[4792]: E0301 09:08:19.265095 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:19Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.353062 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:19Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.589746 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.590648 4792 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="684aa0b5aeaecf3fb572ab400e718956a5373c4fa6651e742d65bdbc3425b9b2" exitCode=255
Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.590713 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"684aa0b5aeaecf3fb572ab400e718956a5373c4fa6651e742d65bdbc3425b9b2"}
Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.590755 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867"}
Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.590939 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.591877 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.591946 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.591964 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:20 crc kubenswrapper[4792]: I0301 09:08:20.352343 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:20Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:21 crc kubenswrapper[4792]: I0301 09:08:21.354376 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:21Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:21 crc kubenswrapper[4792]: E0301 09:08:21.496945 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 01 09:08:22 crc kubenswrapper[4792]: I0301 09:08:22.350610 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:22Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:23 crc kubenswrapper[4792]: I0301 09:08:23.352696 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:23Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:24 crc kubenswrapper[4792]: I0301 09:08:24.350103 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:24Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:24 crc kubenswrapper[4792]: I0301 09:08:24.408198 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:24 crc kubenswrapper[4792]: I0301 09:08:24.409461 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:24 crc kubenswrapper[4792]: I0301 09:08:24.409514 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:24 crc kubenswrapper[4792]: I0301 09:08:24.409532 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:24 crc kubenswrapper[4792]: I0301 09:08:24.410516 4792 scope.go:117] "RemoveContainer" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd"
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.348962 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:25Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.498701 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.498841 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.500342 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.500395 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.500409 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.612249 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.613082 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.615575 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="01dac4fce4f31460a90ca7eaf5d97ffd8254d0b93b55dcb4598bd038633b40d9" exitCode=255
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.615624 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"01dac4fce4f31460a90ca7eaf5d97ffd8254d0b93b55dcb4598bd038633b40d9"}
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.615670 4792 scope.go:117] "RemoveContainer" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd"
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.615882 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.617305 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.617427 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.617452 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.618597 4792 scope.go:117] "RemoveContainer" containerID="01dac4fce4f31460a90ca7eaf5d97ffd8254d0b93b55dcb4598bd038633b40d9"
Mar 01 09:08:25 crc kubenswrapper[4792]: E0301 09:08:25.618969 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 01 09:08:25 crc kubenswrapper[4792]: E0301 09:08:25.877455 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:25Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1898ac74dff1661c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,LastTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 01 09:08:26 crc kubenswrapper[4792]: I0301 09:08:26.266090 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:26 crc kubenswrapper[4792]: I0301 09:08:26.267601 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:26 crc kubenswrapper[4792]: I0301 09:08:26.267643 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:26 crc kubenswrapper[4792]: I0301 09:08:26.267656 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:26 crc kubenswrapper[4792]: I0301 09:08:26.267684 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 01 09:08:26 crc kubenswrapper[4792]: E0301 09:08:26.269510 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:26Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 01 09:08:26 crc kubenswrapper[4792]: E0301 09:08:26.271392 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:26Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 01 09:08:26 crc kubenswrapper[4792]: I0301 09:08:26.351795 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:26Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:26 crc kubenswrapper[4792]: I0301 09:08:26.622944 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 01 09:08:27 crc kubenswrapper[4792]: I0301 09:08:27.350408 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:27Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:28 crc kubenswrapper[4792]: W0301 09:08:28.275208 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:28Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:28 crc kubenswrapper[4792]: E0301 09:08:28.275314 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 01 09:08:28 crc kubenswrapper[4792]: I0301 09:08:28.352350 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:28Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:28 crc kubenswrapper[4792]: I0301 09:08:28.498749 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 01 09:08:28 crc kubenswrapper[4792]: I0301 09:08:28.499221 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 01 09:08:29 crc kubenswrapper[4792]: I0301 09:08:29.348535 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:08:29 crc kubenswrapper[4792]: I0301 09:08:29.348820 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:29 crc kubenswrapper[4792]: I0301 09:08:29.350409 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:29 crc kubenswrapper[4792]: I0301 09:08:29.350515 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:29 crc kubenswrapper[4792]: I0301 09:08:29.350584 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:29 crc kubenswrapper[4792]: I0301 09:08:29.352378 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:29Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:30 crc kubenswrapper[4792]: I0301 09:08:30.159520 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 01 09:08:30 crc kubenswrapper[4792]: E0301 09:08:30.165475 4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 01 09:08:30 crc kubenswrapper[4792]: E0301 09:08:30.166932 4792 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Mar 01 09:08:30 crc kubenswrapper[4792]: I0301 09:08:30.351017 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:30Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:30 crc kubenswrapper[4792]: W0301 09:08:30.536146 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:30Z is after 2026-02-23T05:33:13Z
Mar 01 09:08:30 crc kubenswrapper[4792]: E0301 09:08:30.536403 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 01 09:08:31 crc kubenswrapper[4792]: I0301 09:08:31.017380 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:08:31 crc kubenswrapper[4792]: I0301 09:08:31.018041 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:08:31 crc kubenswrapper[4792]: I0301 09:08:31.019841 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:08:31 crc kubenswrapper[4792]: I0301 09:08:31.019883 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:08:31 crc kubenswrapper[4792]: I0301 09:08:31.019902 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:08:31 crc kubenswrapper[4792]: I0301 09:08:31.020660 4792 scope.go:117] "RemoveContainer" containerID="01dac4fce4f31460a90ca7eaf5d97ffd8254d0b93b55dcb4598bd038633b40d9"
Mar 01 09:08:31 crc kubenswrapper[4792]: E0301 09:08:31.020856 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\""
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:31 crc kubenswrapper[4792]: I0301 09:08:31.350524 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:31Z is after 2026-02-23T05:33:13Z Mar 01 09:08:31 crc kubenswrapper[4792]: E0301 09:08:31.497263 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:08:31 crc kubenswrapper[4792]: W0301 09:08:31.669049 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:31Z is after 2026-02-23T05:33:13Z Mar 01 09:08:31 crc kubenswrapper[4792]: E0301 09:08:31.669176 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:32 crc kubenswrapper[4792]: I0301 09:08:32.202884 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:08:32 crc kubenswrapper[4792]: I0301 09:08:32.203306 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:32 crc kubenswrapper[4792]: I0301 
09:08:32.205180 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:32 crc kubenswrapper[4792]: I0301 09:08:32.205252 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:32 crc kubenswrapper[4792]: I0301 09:08:32.205272 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:32 crc kubenswrapper[4792]: I0301 09:08:32.206279 4792 scope.go:117] "RemoveContainer" containerID="01dac4fce4f31460a90ca7eaf5d97ffd8254d0b93b55dcb4598bd038633b40d9" Mar 01 09:08:32 crc kubenswrapper[4792]: E0301 09:08:32.206635 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:32 crc kubenswrapper[4792]: I0301 09:08:32.352977 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:32Z is after 2026-02-23T05:33:13Z Mar 01 09:08:33 crc kubenswrapper[4792]: I0301 09:08:33.272794 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:33 crc kubenswrapper[4792]: I0301 09:08:33.274753 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:33 crc kubenswrapper[4792]: I0301 09:08:33.275042 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 01 09:08:33 crc kubenswrapper[4792]: I0301 09:08:33.275237 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:33 crc kubenswrapper[4792]: I0301 09:08:33.275478 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:08:33 crc kubenswrapper[4792]: E0301 09:08:33.275792 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:33Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 01 09:08:33 crc kubenswrapper[4792]: E0301 09:08:33.278968 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:33Z is after 2026-02-23T05:33:13Z" node="crc" Mar 01 09:08:33 crc kubenswrapper[4792]: I0301 09:08:33.352539 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:33Z is after 2026-02-23T05:33:13Z Mar 01 09:08:34 crc kubenswrapper[4792]: I0301 09:08:34.468938 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:34Z is after 2026-02-23T05:33:13Z Mar 01 09:08:35 crc kubenswrapper[4792]: I0301 09:08:35.350240 4792 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:35Z is after 2026-02-23T05:33:13Z Mar 01 09:08:35 crc kubenswrapper[4792]: E0301 09:08:35.881858 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:35Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1898ac74dff1661c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,LastTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:36 crc kubenswrapper[4792]: I0301 09:08:36.352125 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:36Z is after 2026-02-23T05:33:13Z Mar 01 09:08:37 crc kubenswrapper[4792]: I0301 09:08:37.352587 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-01T09:08:37Z is after 2026-02-23T05:33:13Z Mar 01 09:08:38 crc kubenswrapper[4792]: I0301 09:08:38.352195 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:38Z is after 2026-02-23T05:33:13Z Mar 01 09:08:38 crc kubenswrapper[4792]: I0301 09:08:38.500137 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:08:38 crc kubenswrapper[4792]: I0301 09:08:38.500237 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 01 09:08:39 crc kubenswrapper[4792]: I0301 09:08:39.351194 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:39 crc kubenswrapper[4792]: W0301 09:08:39.415716 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:39 crc kubenswrapper[4792]: E0301 
09:08:39.415776 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 01 09:08:40 crc kubenswrapper[4792]: I0301 09:08:40.279534 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:40 crc kubenswrapper[4792]: I0301 09:08:40.281032 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:40 crc kubenswrapper[4792]: I0301 09:08:40.281083 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:40 crc kubenswrapper[4792]: I0301 09:08:40.281096 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:40 crc kubenswrapper[4792]: I0301 09:08:40.281128 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:08:40 crc kubenswrapper[4792]: E0301 09:08:40.286200 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 01 09:08:40 crc kubenswrapper[4792]: E0301 09:08:40.286294 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 01 09:08:40 crc kubenswrapper[4792]: I0301 09:08:40.348106 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:41 crc kubenswrapper[4792]: I0301 09:08:41.353495 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:41 crc kubenswrapper[4792]: E0301 09:08:41.497574 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:08:42 crc kubenswrapper[4792]: I0301 09:08:42.350846 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:42 crc kubenswrapper[4792]: I0301 09:08:42.412316 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 01 09:08:42 crc kubenswrapper[4792]: I0301 09:08:42.412465 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:42 crc kubenswrapper[4792]: I0301 09:08:42.413674 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:42 crc kubenswrapper[4792]: I0301 09:08:42.413727 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:42 crc kubenswrapper[4792]: I0301 09:08:42.413746 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:43 crc kubenswrapper[4792]: I0301 09:08:43.351233 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:44 crc kubenswrapper[4792]: I0301 09:08:44.352184 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:45 crc kubenswrapper[4792]: I0301 09:08:45.352595 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:45 crc kubenswrapper[4792]: I0301 09:08:45.407834 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:45 crc kubenswrapper[4792]: I0301 09:08:45.409133 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:45 crc kubenswrapper[4792]: I0301 09:08:45.409175 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:45 crc kubenswrapper[4792]: I0301 09:08:45.409189 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:45 crc kubenswrapper[4792]: I0301 09:08:45.409815 4792 scope.go:117] "RemoveContainer" containerID="01dac4fce4f31460a90ca7eaf5d97ffd8254d0b93b55dcb4598bd038633b40d9" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.889459 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74dff1661c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,LastTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.895051 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.901029 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.909871 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8f137 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,LastTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.915427 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e99c9cf7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.505829111 +0000 UTC m=+0.747708308,LastTimestamp:2026-03-01 09:07:51.505829111 +0000 UTC 
m=+0.747708308,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.921271 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e82468\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.509044676 +0000 UTC m=+0.750923873,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.926622 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8c36e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.509067566 +0000 UTC m=+0.750946763,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.932268 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8f137\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8f137 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,LastTimestamp:2026-03-01 09:07:51.509077396 +0000 UTC m=+0.750956593,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.938686 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e82468\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.510543211 +0000 UTC m=+0.752422408,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.946690 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8c36e\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.510564522 +0000 UTC m=+0.752443719,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.950304 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8f137\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8f137 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,LastTimestamp:2026-03-01 09:07:51.510573612 +0000 UTC m=+0.752452809,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.953658 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e82468\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.511087651 +0000 UTC m=+0.752966848,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.956853 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8c36e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.511123951 +0000 UTC m=+0.753003148,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.960244 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8f137\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8f137 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,LastTimestamp:2026-03-01 09:07:51.511139271 +0000 UTC m=+0.753018458,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.963367 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e82468\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.51223718 +0000 UTC m=+0.754116377,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.966803 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8c36e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.512309911 +0000 UTC m=+0.754189108,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.970622 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8f137\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8f137 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,LastTimestamp:2026-03-01 09:07:51.512317931 +0000 UTC m=+0.754197128,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.974534 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e82468\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC 
m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.512383053 +0000 UTC m=+0.754262250,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.978518 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8c36e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.512400153 +0000 UTC m=+0.754279350,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.982493 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8f137\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8f137 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,LastTimestamp:2026-03-01 09:07:51.512426683 +0000 UTC m=+0.754305880,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.986427 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e82468\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.514226754 +0000 UTC m=+0.756105951,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.989576 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8c36e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.514242644 +0000 UTC m=+0.756121841,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.992805 4792 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8f137\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8f137 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,LastTimestamp:2026-03-01 09:07:51.514254644 +0000 UTC m=+0.756133841,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.995848 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e82468\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.515412044 +0000 UTC m=+0.757291241,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.999011 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8c36e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.515429814 +0000 UTC m=+0.757309011,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.004969 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac75033b4de0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.935659488 +0000 UTC m=+1.177538685,LastTimestamp:2026-03-01 09:07:51.935659488 +0000 UTC m=+1.177538685,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.009072 4792 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898ac750342c65e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.936149086 +0000 UTC m=+1.178028283,LastTimestamp:2026-03-01 09:07:51.936149086 +0000 UTC m=+1.178028283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.015365 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac7503b38ed0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.943540432 +0000 UTC m=+1.185419639,LastTimestamp:2026-03-01 09:07:51.943540432 +0000 UTC 
m=+1.185419639,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.018846 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7504c1862e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.961232942 +0000 UTC m=+1.203112149,LastTimestamp:2026-03-01 09:07:51.961232942 +0000 UTC m=+1.203112149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.022101 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac7505c6f253 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.978365523 +0000 UTC m=+1.220244730,LastTimestamp:2026-03-01 09:07:51.978365523 +0000 UTC m=+1.220244730,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.025377 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac75286b6ad3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.559569619 +0000 UTC m=+1.801448816,LastTimestamp:2026-03-01 09:07:52.559569619 +0000 UTC m=+1.801448816,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.028262 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898ac7528c5651f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.565466399 +0000 UTC m=+1.807345596,LastTimestamp:2026-03-01 09:07:52.565466399 +0000 UTC m=+1.807345596,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.031183 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7528ccf64e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.565962318 +0000 UTC m=+1.807841515,LastTimestamp:2026-03-01 09:07:52.565962318 +0000 UTC m=+1.807841515,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.034169 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac7528cd916d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.566002029 +0000 UTC m=+1.807881226,LastTimestamp:2026-03-01 09:07:52.566002029 +0000 UTC m=+1.807881226,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.037230 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac7528d8a72b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.566728491 +0000 UTC m=+1.808607688,LastTimestamp:2026-03-01 09:07:52.566728491 +0000 UTC m=+1.808607688,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.040274 4792 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac752907a95b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.569809243 +0000 UTC m=+1.811688440,LastTimestamp:2026-03-01 09:07:52.569809243 +0000 UTC m=+1.811688440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.043346 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac7529989bd8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.579308504 +0000 UTC m=+1.821187711,LastTimestamp:2026-03-01 09:07:52.579308504 +0000 UTC m=+1.821187711,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.046431 4792 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7529ada02c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.580685868 +0000 UTC m=+1.822565065,LastTimestamp:2026-03-01 09:07:52.580685868 +0000 UTC m=+1.822565065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.050287 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898ac752a155fff openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.587485183 +0000 UTC m=+1.829364400,LastTimestamp:2026-03-01 09:07:52.587485183 +0000 UTC m=+1.829364400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.053312 4792 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac752a4a8530 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.590968112 +0000 UTC m=+1.832847299,LastTimestamp:2026-03-01 09:07:52.590968112 +0000 UTC m=+1.832847299,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.056190 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac752a631bde openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.59257955 +0000 UTC 
m=+1.834458747,LastTimestamp:2026-03-01 09:07:52.59257955 +0000 UTC m=+1.834458747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.059073 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac753c605ce1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.894389473 +0000 UTC m=+2.136268690,LastTimestamp:2026-03-01 09:07:52.894389473 +0000 UTC m=+2.136268690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: I0301 09:08:46.733452 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.733770 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac753d405016 openshift-kube-controller-manager 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.909066262 +0000 UTC m=+2.150945499,LastTimestamp:2026-03-01 09:07:52.909066262 +0000 UTC m=+2.150945499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.741997 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac753d5b7879 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.910846073 +0000 UTC m=+2.152725310,LastTimestamp:2026-03-01 09:07:52.910846073 +0000 UTC m=+2.152725310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.747460 4792 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac754a119981 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.124108673 +0000 UTC m=+2.365987890,LastTimestamp:2026-03-01 09:07:53.124108673 +0000 UTC m=+2.365987890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.755352 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac754af7b237 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.139188279 +0000 UTC m=+2.381067516,LastTimestamp:2026-03-01 09:07:53.139188279 +0000 UTC 
m=+2.381067516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.764111 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac754b0f4d17 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.140735255 +0000 UTC m=+2.382614472,LastTimestamp:2026-03-01 09:07:53.140735255 +0000 UTC m=+2.382614472,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.771601 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac7556abcc05 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.335540741 +0000 UTC m=+2.577419938,LastTimestamp:2026-03-01 09:07:53.335540741 +0000 UTC m=+2.577419938,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.779521 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac7557731cba openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.348603066 +0000 UTC m=+2.590482283,LastTimestamp:2026-03-01 09:07:53.348603066 +0000 UTC m=+2.590482283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.784532 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac755d3f8e6a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.445887594 +0000 UTC m=+2.687766831,LastTimestamp:2026-03-01 09:07:53.445887594 +0000 UTC m=+2.687766831,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.794326 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac755ddc85a9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.456174505 +0000 UTC m=+2.698053732,LastTimestamp:2026-03-01 09:07:53.456174505 +0000 UTC m=+2.698053732,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.799516 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898ac755e3aa9da openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.462344154 +0000 UTC m=+2.704223371,LastTimestamp:2026-03-01 09:07:53.462344154 +0000 UTC m=+2.704223371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.807431 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac755e53a0b3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.463980211 +0000 UTC m=+2.705859448,LastTimestamp:2026-03-01 09:07:53.463980211 +0000 UTC m=+2.705859448,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.813116 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac756af9f510 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.676207376 +0000 UTC m=+2.918086573,LastTimestamp:2026-03-01 09:07:53.676207376 +0000 UTC m=+2.918086573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.818267 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac756b22b90a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.678878986 +0000 UTC m=+2.920758183,LastTimestamp:2026-03-01 09:07:53.678878986 +0000 UTC m=+2.920758183,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.822730 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac756bc4c6fa openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.689499386 +0000 UTC m=+2.931378583,LastTimestamp:2026-03-01 09:07:53.689499386 +0000 UTC m=+2.931378583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.827166 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac756c095ba0 openshift-kube-apiserver 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.693993888 +0000 UTC m=+2.935873085,LastTimestamp:2026-03-01 09:07:53.693993888 +0000 UTC m=+2.935873085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.831973 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac756c24942e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.695777838 +0000 UTC m=+2.937657035,LastTimestamp:2026-03-01 09:07:53.695777838 +0000 UTC m=+2.937657035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.836821 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898ac756c7abc08 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.701424136 +0000 UTC m=+2.943303353,LastTimestamp:2026-03-01 09:07:53.701424136 +0000 UTC m=+2.943303353,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.839005 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac756c9742c6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.703293638 +0000 UTC m=+2.945172835,LastTimestamp:2026-03-01 09:07:53.703293638 +0000 UTC m=+2.945172835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.843942 4792 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac756de494e7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.725138151 +0000 UTC m=+2.967017348,LastTimestamp:2026-03-01 09:07:53.725138151 +0000 UTC m=+2.967017348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.848074 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898ac756df25b78 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.726040952 +0000 UTC m=+2.967920149,LastTimestamp:2026-03-01 09:07:53.726040952 +0000 UTC m=+2.967920149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc 
kubenswrapper[4792]: E0301 09:08:46.852431 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac756df595d1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.726252497 +0000 UTC m=+2.968131694,LastTimestamp:2026-03-01 09:07:53.726252497 +0000 UTC m=+2.968131694,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.856924 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7578237b11 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.897032465 +0000 UTC 
m=+3.138911672,LastTimestamp:2026-03-01 09:07:53.897032465 +0000 UTC m=+3.138911672,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.861436 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac757970263d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.918834237 +0000 UTC m=+3.160713444,LastTimestamp:2026-03-01 09:07:53.918834237 +0000 UTC m=+3.160713444,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.865557 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac757973a831 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.919064113 +0000 UTC m=+3.160943320,LastTimestamp:2026-03-01 09:07:53.919064113 +0000 UTC m=+3.160943320,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.870677 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac757990b82a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.920968746 +0000 UTC m=+3.162847943,LastTimestamp:2026-03-01 09:07:53.920968746 +0000 UTC m=+3.162847943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.875560 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac757a632257 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.934758487 +0000 UTC m=+3.176637684,LastTimestamp:2026-03-01 09:07:53.934758487 +0000 UTC m=+3.176637684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.879716 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac757a9fdf17 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.938738967 +0000 UTC m=+3.180618154,LastTimestamp:2026-03-01 09:07:53.938738967 +0000 UTC m=+3.180618154,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.884297 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac75868facc5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.139004101 +0000 UTC m=+3.380883298,LastTimestamp:2026-03-01 09:07:54.139004101 +0000 UTC m=+3.380883298,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.888274 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac758737afa0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.15001488 +0000 UTC m=+3.391894077,LastTimestamp:2026-03-01 09:07:54.15001488 +0000 UTC m=+3.391894077,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc 
kubenswrapper[4792]: E0301 09:08:46.891990 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac75874631c6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.150965702 +0000 UTC m=+3.392844899,LastTimestamp:2026-03-01 09:07:54.150965702 +0000 UTC m=+3.392844899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.899161 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac758888fa80 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.17211968 +0000 
UTC m=+3.413998877,LastTimestamp:2026-03-01 09:07:54.17211968 +0000 UTC m=+3.413998877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.903087 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac75899bc1ec openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.190127596 +0000 UTC m=+3.432006783,LastTimestamp:2026-03-01 09:07:54.190127596 +0000 UTC m=+3.432006783,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.909100 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac75930bfd4e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container 
kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.348477774 +0000 UTC m=+3.590356981,LastTimestamp:2026-03-01 09:07:54.348477774 +0000 UTC m=+3.590356981,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.912689 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7593be3ea4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.360159908 +0000 UTC m=+3.602039105,LastTimestamp:2026-03-01 09:07:54.360159908 +0000 UTC m=+3.602039105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.918467 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7593ceb0a8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.361237672 +0000 UTC m=+3.603116869,LastTimestamp:2026-03-01 09:07:54.361237672 +0000 UTC m=+3.603116869,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.922737 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac759ae4b79e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.480121758 +0000 UTC m=+3.722000945,LastTimestamp:2026-03-01 09:07:54.480121758 +0000 UTC m=+3.722000945,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.926189 4792 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac759e6b1cba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.539261114 +0000 UTC m=+3.781140311,LastTimestamp:2026-03-01 09:07:54.539261114 +0000 UTC m=+3.781140311,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.929538 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac759ef2cbb1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.548153265 +0000 UTC m=+3.790032462,LastTimestamp:2026-03-01 09:07:54.548153265 +0000 UTC m=+3.790032462,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc 
kubenswrapper[4792]: E0301 09:08:46.933058 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75a689e7ee openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.675496942 +0000 UTC m=+3.917376139,LastTimestamp:2026-03-01 09:07:54.675496942 +0000 UTC m=+3.917376139,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.938417 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75a72ef6ac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.686314156 +0000 UTC m=+3.928193343,LastTimestamp:2026-03-01 09:07:54.686314156 +0000 UTC m=+3.928193343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.943745 
4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75d6ef9f4a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:55.487469386 +0000 UTC m=+4.729348603,LastTimestamp:2026-03-01 09:07:55.487469386 +0000 UTC m=+4.729348603,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.948777 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75e373e090 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:55.69746344 +0000 UTC m=+4.939342637,LastTimestamp:2026-03-01 09:07:55.69746344 +0000 UTC m=+4.939342637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc 
kubenswrapper[4792]: E0301 09:08:46.953837 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75e416984f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:55.708127311 +0000 UTC m=+4.950006508,LastTimestamp:2026-03-01 09:07:55.708127311 +0000 UTC m=+4.950006508,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.958791 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75e42c334b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:55.709543243 +0000 UTC m=+4.951422470,LastTimestamp:2026-03-01 09:07:55.709543243 +0000 UTC m=+4.951422470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.963005 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75f2d88a8a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:55.955718794 +0000 UTC m=+5.197597991,LastTimestamp:2026-03-01 09:07:55.955718794 +0000 UTC m=+5.197597991,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.966305 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75f3b4457f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:55.970119039 +0000 UTC m=+5.211998236,LastTimestamp:2026-03-01 09:07:55.970119039 +0000 UTC m=+5.211998236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc 
kubenswrapper[4792]: E0301 09:08:46.970164 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75f3cbf698 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:55.971671704 +0000 UTC m=+5.213550901,LastTimestamp:2026-03-01 09:07:55.971671704 +0000 UTC m=+5.213550901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.973192 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac76037f2b29 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.235074345 +0000 UTC m=+5.476953572,LastTimestamp:2026-03-01 09:07:56.235074345 +0000 UTC m=+5.476953572,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.976014 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac76048c3c5e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.252707934 +0000 UTC m=+5.494587161,LastTimestamp:2026-03-01 09:07:56.252707934 +0000 UTC m=+5.494587161,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.981705 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac7604a64dd6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.254416342 +0000 UTC m=+5.496295569,LastTimestamp:2026-03-01 09:07:56.254416342 +0000 UTC 
m=+5.496295569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.987577 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac7612cfdb6e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.49202059 +0000 UTC m=+5.733899777,LastTimestamp:2026-03-01 09:07:56.49202059 +0000 UTC m=+5.733899777,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.993101 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac761366ee4a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.501921354 +0000 UTC m=+5.743800551,LastTimestamp:2026-03-01 09:07:56.501921354 +0000 UTC m=+5.743800551,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.999045 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac7613825da6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.503719334 +0000 UTC m=+5.745598531,LastTimestamp:2026-03-01 09:07:56.503719334 +0000 UTC m=+5.745598531,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.003162 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac761ed176b7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.693452471 +0000 UTC m=+5.935331668,LastTimestamp:2026-03-01 09:07:56.693452471 +0000 UTC 
m=+5.935331668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.006851 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac761f7403e1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.704105441 +0000 UTC m=+5.945984638,LastTimestamp:2026-03-01 09:07:56.704105441 +0000 UTC m=+5.945984638,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.011140 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 01 09:08:47 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.1898ac768a797c5a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers) Mar 01 09:08:47 crc kubenswrapper[4792]: body: Mar 01 09:08:47 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:58.499626074 +0000 UTC m=+7.741505301,LastTimestamp:2026-03-01 09:07:58.499626074 +0000 UTC m=+7.741505301,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 01 09:08:47 crc kubenswrapper[4792]: > Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.014408 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac768a7ad0e7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:58.499713255 +0000 UTC m=+7.741592492,LastTimestamp:2026-03-01 09:07:58.499713255 +0000 UTC m=+7.741592492,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.019289 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1898ac7593ceb0a8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7593ceb0a8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.361237672 +0000 UTC m=+3.603116869,LastTimestamp:2026-03-01 09:08:05.531429824 +0000 UTC m=+14.773309021,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.023206 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1898ac759e6b1cba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac759e6b1cba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.539261114 +0000 UTC m=+3.781140311,LastTimestamp:2026-03-01 09:08:05.784587199 +0000 UTC m=+15.026466406,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 
09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.026484 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1898ac759ef2cbb1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac759ef2cbb1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.548153265 +0000 UTC m=+3.790032462,LastTimestamp:2026-03-01 09:08:05.802115155 +0000 UTC m=+15.043994352,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.030025 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 01 09:08:47 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-apiserver-crc.1898ac7840f70a28 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 01 09:08:47 crc kubenswrapper[4792]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" 
cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 01 09:08:47 crc kubenswrapper[4792]: Mar 01 09:08:47 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:05.856274984 +0000 UTC m=+15.098154181,LastTimestamp:2026-03-01 09:08:05.856274984 +0000 UTC m=+15.098154181,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 01 09:08:47 crc kubenswrapper[4792]: > Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.033216 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7840f7bf24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:05.856321316 +0000 UTC m=+15.098200513,LastTimestamp:2026-03-01 09:08:05.856321316 +0000 UTC m=+15.098200513,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.037182 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1898ac7840f70a28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 01 09:08:47 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-apiserver-crc.1898ac7840f70a28 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 01 09:08:47 crc kubenswrapper[4792]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 01 09:08:47 crc kubenswrapper[4792]: Mar 01 09:08:47 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:05.856274984 +0000 UTC m=+15.098154181,LastTimestamp:2026-03-01 09:08:05.866560436 +0000 UTC m=+15.108439643,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 01 09:08:47 crc kubenswrapper[4792]: > Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.042231 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1898ac7840f7bf24\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7840f7bf24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:05.856321316 +0000 UTC m=+15.098200513,LastTimestamp:2026-03-01 09:08:05.866671319 +0000 UTC 
m=+15.108550526,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.046146 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 01 09:08:47 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.1898ac78de8f7303 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 01 09:08:47 crc kubenswrapper[4792]: body: Mar 01 09:08:47 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:08.500286211 +0000 UTC m=+17.742165418,LastTimestamp:2026-03-01 09:08:08.500286211 +0000 UTC m=+17.742165418,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 01 09:08:47 crc kubenswrapper[4792]: > Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.049374 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac78de9075f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:08.500352503 +0000 UTC m=+17.742231700,LastTimestamp:2026-03-01 09:08:08.500352503 +0000 UTC m=+17.742231700,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.054351 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac78de8f7303\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 01 09:08:47 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.1898ac78de8f7303 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 01 09:08:47 crc kubenswrapper[4792]: body: Mar 01 09:08:47 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:08.500286211 +0000 UTC m=+17.742165418,LastTimestamp:2026-03-01 09:08:18.500007446 
+0000 UTC m=+27.741886643,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 01 09:08:47 crc kubenswrapper[4792]: > Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.057491 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac78de9075f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac78de9075f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:08.500352503 +0000 UTC m=+17.742231700,LastTimestamp:2026-03-01 09:08:18.500073558 +0000 UTC m=+27.741952745,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.061382 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac7b32c1bd0a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:18.502802698 +0000 UTC m=+27.744681895,LastTimestamp:2026-03-01 09:08:18.502802698 +0000 UTC m=+27.744681895,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.064692 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac752a631bde\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac752a631bde openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.59257955 +0000 UTC m=+1.834458747,LastTimestamp:2026-03-01 09:08:18.630126718 +0000 UTC m=+27.872005915,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.068159 4792 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac753c605ce1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac753c605ce1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.894389473 +0000 UTC m=+2.136268690,LastTimestamp:2026-03-01 09:08:18.829823663 +0000 UTC m=+28.071702870,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.071815 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac753d405016\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac753d405016 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.909066262 +0000 UTC 
m=+2.150945499,LastTimestamp:2026-03-01 09:08:18.840047623 +0000 UTC m=+28.081926820,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.076232 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac78de8f7303\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 01 09:08:47 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.1898ac78de8f7303 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 01 09:08:47 crc kubenswrapper[4792]: body: Mar 01 09:08:47 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:08.500286211 +0000 UTC m=+17.742165418,LastTimestamp:2026-03-01 09:08:28.499180456 +0000 UTC m=+37.741059713,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 01 09:08:47 crc kubenswrapper[4792]: > Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.080038 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac78de9075f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac78de9075f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:08.500352503 +0000 UTC m=+17.742231700,LastTimestamp:2026-03-01 09:08:28.499392611 +0000 UTC m=+37.741271868,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.085387 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac78de8f7303\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 01 09:08:47 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.1898ac78de8f7303 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 01 09:08:47 crc kubenswrapper[4792]: body: Mar 01 09:08:47 crc kubenswrapper[4792]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:08.500286211 +0000 UTC m=+17.742165418,LastTimestamp:2026-03-01 09:08:38.500206703 +0000 UTC m=+47.742085930,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 01 09:08:47 crc kubenswrapper[4792]: > Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.287246 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.288469 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.288497 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.288522 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.288546 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.293483 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.293826 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.348094 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.747745 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.749205 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.752417 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c" exitCode=255 Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.752458 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c"} Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.752537 4792 scope.go:117] "RemoveContainer" containerID="01dac4fce4f31460a90ca7eaf5d97ffd8254d0b93b55dcb4598bd038633b40d9" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.752718 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.754287 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.754417 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.754444 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.755344 4792 scope.go:117] "RemoveContainer" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.755620 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.351937 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.500053 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.500169 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.500247 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.500436 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.501993 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.502030 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.502048 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.502686 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.502880 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867" gracePeriod=30 Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.756757 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.760661 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.761868 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.762448 4792 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867" exitCode=255 Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.762502 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867"} Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.762540 4792 scope.go:117] "RemoveContainer" containerID="684aa0b5aeaecf3fb572ab400e718956a5373c4fa6651e742d65bdbc3425b9b2" Mar 01 09:08:49 crc kubenswrapper[4792]: I0301 09:08:49.351651 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:49 crc kubenswrapper[4792]: I0301 09:08:49.765532 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 01 09:08:49 crc kubenswrapper[4792]: I0301 09:08:49.766870 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c"} Mar 01 09:08:49 crc kubenswrapper[4792]: I0301 09:08:49.766984 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:49 crc kubenswrapper[4792]: I0301 09:08:49.767712 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:49 crc kubenswrapper[4792]: I0301 09:08:49.767741 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:49 crc kubenswrapper[4792]: I0301 09:08:49.767752 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:50 crc kubenswrapper[4792]: I0301 09:08:50.350199 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:50 crc kubenswrapper[4792]: I0301 09:08:50.769299 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:50 crc kubenswrapper[4792]: I0301 09:08:50.770956 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:50 crc kubenswrapper[4792]: I0301 09:08:50.771011 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:50 crc kubenswrapper[4792]: I0301 09:08:50.771027 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:51 crc kubenswrapper[4792]: I0301 09:08:51.017734 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:08:51 crc kubenswrapper[4792]: I0301 09:08:51.017922 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:51 crc kubenswrapper[4792]: I0301 09:08:51.018978 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:51 crc kubenswrapper[4792]: I0301 09:08:51.019088 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:51 crc kubenswrapper[4792]: I0301 09:08:51.019158 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:51 crc kubenswrapper[4792]: I0301 09:08:51.019748 4792 scope.go:117] "RemoveContainer" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c" Mar 01 09:08:51 crc kubenswrapper[4792]: E0301 09:08:51.020017 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:51 crc kubenswrapper[4792]: I0301 09:08:51.352056 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:51 crc kubenswrapper[4792]: E0301 09:08:51.498188 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:08:52 crc kubenswrapper[4792]: I0301 09:08:52.203083 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:08:52 crc kubenswrapper[4792]: I0301 09:08:52.203285 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:52 crc kubenswrapper[4792]: I0301 09:08:52.204371 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:52 crc kubenswrapper[4792]: I0301 09:08:52.204492 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:52 crc kubenswrapper[4792]: I0301 09:08:52.204556 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:52 crc kubenswrapper[4792]: I0301 09:08:52.205223 4792 scope.go:117] "RemoveContainer" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c" Mar 01 09:08:52 crc kubenswrapper[4792]: E0301 09:08:52.205457 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:52 crc kubenswrapper[4792]: I0301 09:08:52.350864 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:53 crc kubenswrapper[4792]: I0301 09:08:53.351245 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 
09:08:54 crc kubenswrapper[4792]: I0301 09:08:54.294487 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:54 crc kubenswrapper[4792]: I0301 09:08:54.295839 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:54 crc kubenswrapper[4792]: I0301 09:08:54.295972 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:54 crc kubenswrapper[4792]: I0301 09:08:54.296105 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:54 crc kubenswrapper[4792]: I0301 09:08:54.296211 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:08:54 crc kubenswrapper[4792]: E0301 09:08:54.301031 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 01 09:08:54 crc kubenswrapper[4792]: E0301 09:08:54.301126 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 01 09:08:54 crc kubenswrapper[4792]: I0301 09:08:54.348331 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.349989 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" 
at the cluster scope Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.498728 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.498929 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.499979 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.500114 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.500185 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.502928 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.780838 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.780933 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.782040 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.782079 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.782090 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
01 09:08:56 crc kubenswrapper[4792]: I0301 09:08:56.354022 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:56 crc kubenswrapper[4792]: I0301 09:08:56.784518 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:56 crc kubenswrapper[4792]: I0301 09:08:56.786309 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:56 crc kubenswrapper[4792]: I0301 09:08:56.786359 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:56 crc kubenswrapper[4792]: I0301 09:08:56.786374 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:57 crc kubenswrapper[4792]: I0301 09:08:57.351179 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:57 crc kubenswrapper[4792]: W0301 09:08:57.852027 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 01 09:08:57 crc kubenswrapper[4792]: E0301 09:08:57.852090 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 01 09:08:58 crc kubenswrapper[4792]: I0301 09:08:58.350132 4792 csi_plugin.go:884] Failed 
to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:59 crc kubenswrapper[4792]: I0301 09:08:59.351787 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:59 crc kubenswrapper[4792]: I0301 09:08:59.354442 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:59 crc kubenswrapper[4792]: I0301 09:08:59.354624 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:59 crc kubenswrapper[4792]: I0301 09:08:59.356027 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:59 crc kubenswrapper[4792]: I0301 09:08:59.356137 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:59 crc kubenswrapper[4792]: I0301 09:08:59.356406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:00 crc kubenswrapper[4792]: I0301 09:09:00.350484 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:09:01 crc kubenswrapper[4792]: I0301 09:09:01.301889 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:01 crc kubenswrapper[4792]: I0301 09:09:01.303090 4792 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:01 crc kubenswrapper[4792]: I0301 09:09:01.303133 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:01 crc kubenswrapper[4792]: I0301 09:09:01.303144 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:01 crc kubenswrapper[4792]: I0301 09:09:01.303169 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:09:01 crc kubenswrapper[4792]: E0301 09:09:01.305732 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 01 09:09:01 crc kubenswrapper[4792]: E0301 09:09:01.305922 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 01 09:09:01 crc kubenswrapper[4792]: I0301 09:09:01.350291 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:09:01 crc kubenswrapper[4792]: E0301 09:09:01.498654 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:09:02 crc kubenswrapper[4792]: I0301 09:09:02.168467 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 01 09:09:02 crc kubenswrapper[4792]: I0301 09:09:02.179759 4792 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 01 09:09:02 crc 
kubenswrapper[4792]: I0301 09:09:02.350828 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:09:03 crc kubenswrapper[4792]: I0301 09:09:03.350324 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:09:03 crc kubenswrapper[4792]: I0301 09:09:03.408016 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:03 crc kubenswrapper[4792]: I0301 09:09:03.409175 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:03 crc kubenswrapper[4792]: I0301 09:09:03.409207 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:03 crc kubenswrapper[4792]: I0301 09:09:03.409218 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:03 crc kubenswrapper[4792]: I0301 09:09:03.409734 4792 scope.go:117] "RemoveContainer" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c" Mar 01 09:09:03 crc kubenswrapper[4792]: E0301 09:09:03.409968 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:09:04 crc kubenswrapper[4792]: I0301 09:09:04.351189 4792 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:09:05 crc kubenswrapper[4792]: I0301 09:09:05.352522 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:09:06 crc kubenswrapper[4792]: I0301 09:09:06.352967 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:09:06 crc kubenswrapper[4792]: I0301 09:09:06.372730 4792 csr.go:261] certificate signing request csr-j9p7j is approved, waiting to be issued Mar 01 09:09:06 crc kubenswrapper[4792]: I0301 09:09:06.380125 4792 csr.go:257] certificate signing request csr-j9p7j is issued Mar 01 09:09:06 crc kubenswrapper[4792]: I0301 09:09:06.419348 4792 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 01 09:09:06 crc kubenswrapper[4792]: I0301 09:09:06.723560 4792 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 01 09:09:07 crc kubenswrapper[4792]: I0301 09:09:07.204878 4792 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 01 09:09:07 crc kubenswrapper[4792]: W0301 09:09:07.205086 4792 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Mar 01 09:09:07 crc kubenswrapper[4792]: I0301 09:09:07.382070 4792 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-23 03:27:09.919042952 +0000 UTC Mar 01 09:09:07 crc kubenswrapper[4792]: I0301 09:09:07.382969 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6402h18m2.536085007s for next certificate rotation Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.306420 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.307741 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.307792 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.307804 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.307948 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.315663 4792 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.315986 4792 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.316020 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.319425 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.319516 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.319580 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.319639 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.319694 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:08Z","lastTransitionTime":"2026-03-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.337045 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.345367 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.345520 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.345529 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.345558 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.345568 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:08Z","lastTransitionTime":"2026-03-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.356341 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.364779 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.364809 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.364818 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.364833 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.364843 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:08Z","lastTransitionTime":"2026-03-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.373759 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.381001 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.381206 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.381403 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.381701 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.382026 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:08Z","lastTransitionTime":"2026-03-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.391410 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.391516 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.391537 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.492598 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.593337 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.694263 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.794672 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.895193 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.995838 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.096825 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.197694 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.298370 4792 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.399222 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:09 crc kubenswrapper[4792]: I0301 09:09:09.408625 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:09:09 crc kubenswrapper[4792]: I0301 09:09:09.409780 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:09:09 crc kubenswrapper[4792]: I0301 09:09:09.409815 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:09:09 crc kubenswrapper[4792]: I0301 09:09:09.409824 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.500333 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.600951 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.702049 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.803091 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.903987 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.005071 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.106189 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.206589 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.307151 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.407433 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.508372 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.608469 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.709429 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.809877 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.910959 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.011704 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.112800 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.213947 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.314341 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.414838 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.498976 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.515492 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.616359 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.717039 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:11 crc kubenswrapper[4792]: I0301 09:09:11.729997 4792 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.817189 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.918087 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.019206 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.120036 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.220173 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.320375 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.421231 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.521486 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.622508 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.723041 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.823287 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.923429 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.024199 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.125182 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.226068 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.326684 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.426977 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.527842 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.628291 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.729322 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.830001 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.931438 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.031584 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.133088 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.233937 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.334289 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.434887 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.535986 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.637047 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.738261 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.838434 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.939411 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.040393 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.141452 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.242341 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.342718 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.444250 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.544383 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.645292 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.745989 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.846440 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.947616 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.048502 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.149751 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.250973 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.351865 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.452782 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.553587 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.654381 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.755310 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.855614 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.956754 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.057052 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.157710 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.258545 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.359272 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.460408 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.561136 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.662233 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.762833 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.863667 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.964462 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.065336 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.166167 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.267252 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.367899 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.408475 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.409850 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.409890 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.409920 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.410493 4792 scope.go:117] "RemoveContainer" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c"
Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.410640 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.468476 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.569373 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.652221 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.656263 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.656294 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.656305 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.656320 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.656331 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:18Z","lastTransitionTime":"2026-03-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.665782 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.669524 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.669569 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.669585 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.669605 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.669620 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:18Z","lastTransitionTime":"2026-03-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.680084 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.684107 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.684151 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.684166 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.684186 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.684200 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:18Z","lastTransitionTime":"2026-03-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.695567 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.699226 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.699288 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.699306 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.699330 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.699345 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:18Z","lastTransitionTime":"2026-03-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.708215 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.708329 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.708354 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.809290 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.910172 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.010954 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.111583 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.212346 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.313268 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.414305 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.515114 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.615944 4792 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.716855 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.817183 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.917925 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.018606 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.119246 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.220288 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.321385 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.422179 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.523037 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.624046 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.724324 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc 
kubenswrapper[4792]: E0301 09:09:20.824502 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.925223 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.025314 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.126276 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.226570 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.326693 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.427074 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.499165 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.527606 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.628181 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.728769 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.829752 4792 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.930819 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.031529 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.132190 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.233025 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.333117 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.434105 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.535161 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.635990 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.736181 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.836291 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.936393 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.037010 4792 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.137581 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.238397 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.339360 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.440031 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.541170 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.642098 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.742617 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.844114 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.945173 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.045815 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.146616 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc 
kubenswrapper[4792]: E0301 09:09:24.247231 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.347805 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: I0301 09:09:24.408665 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:24 crc kubenswrapper[4792]: I0301 09:09:24.409665 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:24 crc kubenswrapper[4792]: I0301 09:09:24.409732 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:24 crc kubenswrapper[4792]: I0301 09:09:24.409752 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.448735 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.549695 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.650655 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.751074 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.851693 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.951897 4792 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.052491 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.152576 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.252868 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.354001 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.455083 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.556128 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.656227 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.756519 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.856829 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.957479 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.058102 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.158976 4792 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.259827 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.360024 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.461173 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.562178 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.663031 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.763794 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.864563 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.965414 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.066377 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.167343 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.267726 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 
09:09:27.368647 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.470110 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.570931 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.672599 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.773640 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.874279 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.974408 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.075052 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.175373 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.276402 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.376804 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.477191 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 
09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.577422 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.678560 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.779100 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.879376 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.980409 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.992630 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: I0301 09:09:28.994057 4792 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 01 09:09:28 crc kubenswrapper[4792]: I0301 09:09:28.996010 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:28 crc kubenswrapper[4792]: I0301 09:09:28.996079 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:28 crc kubenswrapper[4792]: I0301 09:09:28.996092 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:28 crc kubenswrapper[4792]: I0301 09:09:28.996110 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:28 crc kubenswrapper[4792]: I0301 09:09:28.996122 4792 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:28Z","lastTransitionTime":"2026-03-01T09:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.005497 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.008702 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.008761 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.008773 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.008787 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.008798 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:29Z","lastTransitionTime":"2026-03-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.019775 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.022979 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.023015 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.023041 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.023065 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.023082 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:29Z","lastTransitionTime":"2026-03-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.034153 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.037975 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.038011 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.038020 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.038032 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.038041 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:29Z","lastTransitionTime":"2026-03-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.046654 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.047015 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.081236 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.182178 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.283049 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.383996 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.484884 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.585761 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.687580 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.788744 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.889179 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.990124 4792 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.090785 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.191318 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.292423 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.392760 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.494129 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.594290 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.695345 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.795951 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.896126 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.997322 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.098297 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc 
kubenswrapper[4792]: E0301 09:09:31.198882 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.299803 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.400956 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.408456 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.410332 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.410419 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.410438 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.411773 4792 scope.go:117] "RemoveContainer" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.500040 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.501133 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.601819 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.702189 4792 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.802570 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.865985 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.867108 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2"} Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.867222 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.867883 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.867918 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.867926 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.903472 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.004263 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.105398 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc 
kubenswrapper[4792]: I0301 09:09:32.202322 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.206535 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.306866 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.407101 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.507647 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.608192 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.708951 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.810184 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.872174 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.872600 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.874471 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" exitCode=255 Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.874670 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2"} Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.874836 4792 scope.go:117] "RemoveContainer" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.875210 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.876487 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.876517 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.876527 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.877042 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.877202 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.911420 4792 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.012232 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.091148 4792 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.114799 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.114834 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.114843 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.114856 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.114866 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.218061 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.218114 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.218131 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.218155 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.218173 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.321215 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.321245 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.321254 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.321266 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.321276 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.423766 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.423813 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.423824 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.423840 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.423851 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.526545 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.526581 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.526592 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.526606 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.526618 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.629961 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.630020 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.630033 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.630051 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.630062 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.732799 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.732880 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.732935 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.732984 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.733008 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.754831 4792 apiserver.go:52] "Watching apiserver" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.759449 4792 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.759765 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.760254 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.760376 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.760390 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.760629 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.760668 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.760883 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.760387 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.761151 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.760724 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.763900 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.764357 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.764465 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.764589 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.765088 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.765998 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.767432 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.767691 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.768518 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.787519 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.800939 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.813027 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.822578 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.835145 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.835206 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.835227 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.835257 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.835282 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.841878 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.852413 4792 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.852756 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.864104 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.872509 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.878451 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.887561 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.893721 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.894281 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.894661 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.900969 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905765 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905810 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905839 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905864 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905890 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905932 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905956 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905979 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: 
I0301 09:09:33.906000 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906022 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906045 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906067 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906089 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906111 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906132 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906154 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906174 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906199 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906221 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906243 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906265 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906287 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906308 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906331 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906354 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906374 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906394 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906416 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906438 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906459 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906480 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906501 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906521 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906544 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906569 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906590 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906612 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906634 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906656 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906678 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906701 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906723 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906746 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906766 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906791 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906813 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906834 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906855 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906877 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906898 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906939 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906961 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906983 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907004 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907025 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907047 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907070 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907091 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907112 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906328 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906448 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907158 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907183 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907205 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907228 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907250 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907272 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907293 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907315 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907353 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907377 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907400 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 
09:09:33.907426 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907448 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907471 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907493 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907516 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907538 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907560 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907583 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907605 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907626 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907650 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907672 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907694 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907718 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907741 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907764 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907786 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907807 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907831 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907853 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907874 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907895 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907933 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907954 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907974 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908022 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908048 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908070 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 01 09:09:33 crc kubenswrapper[4792]: 
I0301 09:09:33.908093 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908115 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908139 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908162 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908188 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908211 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908234 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908257 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908280 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908301 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908324 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908346 
4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908369 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908392 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908417 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908441 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914108 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914189 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914233 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914267 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914306 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914341 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914369 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914488 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914523 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914560 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914597 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914635 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914669 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914705 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914740 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914768 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914798 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914841 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914876 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914924 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914955 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914985 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915012 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915043 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915093 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915125 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915156 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915184 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915216 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915248 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915281 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915311 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915345 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915381 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915409 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915441 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915473 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915508 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915537 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 01 09:09:33 crc 
kubenswrapper[4792]: I0301 09:09:33.915571 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915602 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915629 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915661 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915691 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915721 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915753 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915786 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915817 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915848 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915884 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 01 
09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915938 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915968 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915999 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916034 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916064 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916098 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916130 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916164 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916193 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916227 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916263 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 01 
09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916291 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916324 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916356 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916385 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916418 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916452 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916485 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916514 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916549 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916585 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916615 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 01 09:09:33 
crc kubenswrapper[4792]: I0301 09:09:33.916652 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916692 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916726 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916754 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916795 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916831 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916863 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916897 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916947 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917074 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917116 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917153 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917215 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917254 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917288 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917326 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917362 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917396 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917432 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917467 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917499 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917536 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917667 4792 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917693 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906709 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod 
"6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906854 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906960 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907060 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907126 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907257 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907311 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907372 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907553 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907567 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907597 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907670 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907798 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907988 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908104 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908142 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908174 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908254 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908379 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.912245 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.912524 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.912763 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.913015 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.913276 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.913772 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914365 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914405 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914697 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914835 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.921044 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915789 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916583 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917287 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917709 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.918362 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.918788 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.919061 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.919102 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.919443 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.919522 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.919574 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.919849 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.919900 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.920408 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.920414 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.922004 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.922152 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.922429 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.922567 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.922874 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923283 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923374 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923391 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923485 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923693 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923644 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923835 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.924085 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.924267 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.924581 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.924641 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.929335 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.929494 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.929553 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.929632 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.929771 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.929782 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.929950 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930049 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930117 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930203 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930275 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930401 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930560 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930822 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930853 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.931005 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.931045 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.931207 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.931516 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.932020 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.932059 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.932859 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.933093 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.933316 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934177 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.933873 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923244 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934311 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934327 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934626 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934641 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934662 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934807 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934848 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.935210 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.935500 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.935541 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.935676 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.936239 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.937262 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.937710 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.937779 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.938362 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.938500 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.938811 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.939260 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.939259 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.939337 4792 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.939587 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.939721 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.939952 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.940293 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.940520 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.940607 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.940989 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.941070 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.941414 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.941555 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.941607 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.941760 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.942109 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.942420 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.942739 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.943068 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.943304 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.943751 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.944118 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.944437 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.944808 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.944823 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.945439 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.945764 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.946382 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947291 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947301 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947341 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947501 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947523 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947534 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947550 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947561 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.948309 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.948791 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.949233 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.949493 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.949727 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.950067 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.950559 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.950554 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.951641 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.951520 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.951992 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.952062 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.952318 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:34.442266466 +0000 UTC m=+103.684145853 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.952590 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.953069 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.953097 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.953113 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.953206 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:34.45315565 +0000 UTC m=+103.695034867 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.953747 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.956537 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.956619 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.956640 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.956654 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.956691 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:34.456678679 +0000 UTC m=+103.698557896 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.957748 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.957814 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.957836 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.957972 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.958409 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.958441 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.958449 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.958728 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.958812 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.953814 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.954093 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.960510 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.961450 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.961902 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.961960 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.962624 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.962701 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.963365 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.963454 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.964046 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.964828 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.965048 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.965133 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.965371 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.966783 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.966977 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.968771 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.969200 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.969354 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:09:34.469329587 +0000 UTC m=+103.711208994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.969812 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.969969 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970312 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970326 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970409 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970456 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970511 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.971313 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.971355 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970723 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970803 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.971395 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:34.471372399 +0000 UTC m=+103.713251806 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.971041 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.971263 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970930 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.971830 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.972011 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.972366 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.973011 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.978466 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.981071 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.982878 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.986323 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.989410 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.992311 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.003758 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.006793 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.008791 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.009330 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018728 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018775 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018821 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018833 4792 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018853 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018866 4792 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc 
kubenswrapper[4792]: I0301 09:09:34.018860 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018874 4792 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018929 4792 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018945 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018959 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018971 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018984 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 
01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018996 4792 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018999 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019009 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019061 4792 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019073 4792 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019084 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019094 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019104 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019116 4792 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019127 4792 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019137 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019147 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019158 4792 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019169 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 
crc kubenswrapper[4792]: I0301 09:09:34.019181 4792 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019192 4792 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019201 4792 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019209 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019221 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019234 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019246 4792 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019257 4792 reconciler_common.go:293] 
"Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019267 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019275 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019283 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019291 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019299 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019309 4792 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019318 4792 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019326 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019337 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019349 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019359 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019371 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019381 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019391 4792 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019399 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019408 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019417 4792 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019425 4792 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019433 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019441 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019450 4792 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on 
node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019458 4792 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019466 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019475 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019483 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019492 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019501 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019510 4792 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019519 4792 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019527 4792 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019536 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019544 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019556 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019565 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019577 4792 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019588 4792 reconciler_common.go:293] "Volume 
detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019598 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019609 4792 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019620 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019629 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019638 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019646 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019655 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019663 4792 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019671 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019679 4792 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019687 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019695 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019706 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019716 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath 
\"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019727 4792 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019737 4792 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019746 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019757 4792 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019765 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019774 4792 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019782 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019790 4792 reconciler_common.go:293] "Volume 
detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019798 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019806 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019814 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019824 4792 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019832 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019840 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019848 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019856 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019865 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019873 4792 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019882 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019890 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019898 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019922 4792 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019930 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019938 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019948 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019958 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019968 4792 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019976 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019984 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc 
kubenswrapper[4792]: I0301 09:09:34.019992 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020001 4792 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020009 4792 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020019 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020029 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020041 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020053 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020062 4792 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020071 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020080 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020088 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020096 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020105 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020114 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020123 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" 
DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020132 4792 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020142 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020152 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020161 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020169 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020178 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020186 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020196 4792 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020205 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020217 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020228 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020237 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020245 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020254 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020262 4792 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020271 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020281 4792 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020290 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020300 4792 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020309 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020317 4792 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020325 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020334 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020341 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020350 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020358 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020365 4792 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020374 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020385 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on 
node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020394 4792 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020402 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020410 4792 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020417 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020426 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020433 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020442 4792 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 
09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020451 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020460 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020469 4792 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020478 4792 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020488 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020495 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020503 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020512 4792 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020523 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020534 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020543 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020552 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020561 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020570 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020579 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on 
node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020587 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020596 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020605 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020613 4792 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020622 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020630 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020639 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc 
kubenswrapper[4792]: I0301 09:09:34.020647 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020655 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020666 4792 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020674 4792 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020682 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020691 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020699 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020707 4792 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020716 4792 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.050334 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.050376 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.050387 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.050403 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.050413 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.086838 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.099817 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:34 crc kubenswrapper[4792]: W0301 09:09:34.100841 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-12fd174335061da629c1882ca1f77f94c61a341aa7323ea0f73130b5e76e91c0 WatchSource:0}: Error finding container 12fd174335061da629c1882ca1f77f94c61a341aa7323ea0f73130b5e76e91c0: Status 404 returned error can't find the container with id 12fd174335061da629c1882ca1f77f94c61a341aa7323ea0f73130b5e76e91c0 Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.110165 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:34 crc kubenswrapper[4792]: W0301 09:09:34.111246 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-74b19e7a9fee01e9ac34eb744d89824e9cd74e491b82483d9e467d08db94da73 WatchSource:0}: Error finding container 74b19e7a9fee01e9ac34eb744d89824e9cd74e491b82483d9e467d08db94da73: Status 404 returned error can't find the container with id 74b19e7a9fee01e9ac34eb744d89824e9cd74e491b82483d9e467d08db94da73 Mar 01 09:09:34 crc kubenswrapper[4792]: W0301 09:09:34.122094 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-9b7eb2bcfda86474e39ff54de92c5f284cc19d1fed02f206b5e7c7210ed5f7f0 WatchSource:0}: Error finding container 9b7eb2bcfda86474e39ff54de92c5f284cc19d1fed02f206b5e7c7210ed5f7f0: Status 404 returned error can't find the container with id 9b7eb2bcfda86474e39ff54de92c5f284cc19d1fed02f206b5e7c7210ed5f7f0 Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.152570 4792 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.152605 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.152616 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.152630 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.152639 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.255343 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.255365 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.255373 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.255385 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.255393 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.357676 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.357699 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.357708 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.357723 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.357733 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.459484 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.459523 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.459534 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.459550 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.459561 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.524016 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.524093 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.524116 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.524140 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524171 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:09:35.524149966 +0000 UTC m=+104.766029163 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524195 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.524202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524229 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:35.524221328 +0000 UTC m=+104.766100525 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524294 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524309 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524304 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524353 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524368 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524370 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524408 4792 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:35.524399053 +0000 UTC m=+104.766278250 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524321 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524423 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:35.524416213 +0000 UTC m=+104.766295410 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524457 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:35.524438023 +0000 UTC m=+104.766317230 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.561150 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.561185 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.561196 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.561213 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.561227 4792 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.619254 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-bqszv"] Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.619524 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zql8j"] Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.619683 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.619866 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.622166 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.622180 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.622530 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.623272 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.624107 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.625120 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.625373 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.625492 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.643769 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.653009 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.661946 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.663324 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.663360 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.663371 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.663385 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.663397 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.669557 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.679444 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.688634 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.698114 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.707000 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.717230 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.726870 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjzhv\" (UniqueName: \"kubernetes.io/projected/0982a9bb-56d4-4e1c-86cb-76a4152de9ba-kube-api-access-hjzhv\") pod \"node-resolver-zql8j\" (UID: \"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\") " pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.726962 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0982a9bb-56d4-4e1c-86cb-76a4152de9ba-hosts-file\") pod \"node-resolver-zql8j\" (UID: \"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\") " pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.726999 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9105f6b0-6f16-47aa-8009-73736a90b765-rootfs\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.727030 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwr4p\" (UniqueName: \"kubernetes.io/projected/9105f6b0-6f16-47aa-8009-73736a90b765-kube-api-access-qwr4p\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.727128 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9105f6b0-6f16-47aa-8009-73736a90b765-proxy-tls\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.727164 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9105f6b0-6f16-47aa-8009-73736a90b765-mcd-auth-proxy-config\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.728720 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.737310 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.765640 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.765681 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.765694 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.765712 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.765727 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.788966 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.807442 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.818696 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828046 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9105f6b0-6f16-47aa-8009-73736a90b765-proxy-tls\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828239 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9105f6b0-6f16-47aa-8009-73736a90b765-mcd-auth-proxy-config\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjzhv\" (UniqueName: \"kubernetes.io/projected/0982a9bb-56d4-4e1c-86cb-76a4152de9ba-kube-api-access-hjzhv\") pod \"node-resolver-zql8j\" (UID: \"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\") " pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828400 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0982a9bb-56d4-4e1c-86cb-76a4152de9ba-hosts-file\") pod \"node-resolver-zql8j\" (UID: \"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\") " pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828477 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9105f6b0-6f16-47aa-8009-73736a90b765-rootfs\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828652 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwr4p\" (UniqueName: \"kubernetes.io/projected/9105f6b0-6f16-47aa-8009-73736a90b765-kube-api-access-qwr4p\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") 
" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828557 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9105f6b0-6f16-47aa-8009-73736a90b765-rootfs\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828536 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0982a9bb-56d4-4e1c-86cb-76a4152de9ba-hosts-file\") pod \"node-resolver-zql8j\" (UID: \"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\") " pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.829036 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9105f6b0-6f16-47aa-8009-73736a90b765-mcd-auth-proxy-config\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.829652 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.841835 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.853991 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.859574 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9105f6b0-6f16-47aa-8009-73736a90b765-proxy-tls\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.860192 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwr4p\" (UniqueName: 
\"kubernetes.io/projected/9105f6b0-6f16-47aa-8009-73736a90b765-kube-api-access-qwr4p\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.860287 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjzhv\" (UniqueName: \"kubernetes.io/projected/0982a9bb-56d4-4e1c-86cb-76a4152de9ba-kube-api-access-hjzhv\") pod \"node-resolver-zql8j\" (UID: \"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\") " pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.868108 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.868131 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.868140 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.868153 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.868161 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.883899 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.883944 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.883953 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"74b19e7a9fee01e9ac34eb744d89824e9cd74e491b82483d9e467d08db94da73"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.887060 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.887173 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"12fd174335061da629c1882ca1f77f94c61a341aa7323ea0f73130b5e76e91c0"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.888100 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9b7eb2bcfda86474e39ff54de92c5f284cc19d1fed02f206b5e7c7210ed5f7f0"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.888477 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.888595 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.896274 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.908589 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.920347 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.931045 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.933708 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.941859 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.944735 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: W0301 09:09:34.948168 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0982a9bb_56d4_4e1c_86cb_76a4152de9ba.slice/crio-ddbf2f829a1dc12dae3b569d0616d5fafc69ce7d261bb673e284e0d71f91480f WatchSource:0}: Error finding container ddbf2f829a1dc12dae3b569d0616d5fafc69ce7d261bb673e284e0d71f91480f: Status 404 returned error can't find the container with id ddbf2f829a1dc12dae3b569d0616d5fafc69ce7d261bb673e284e0d71f91480f Mar 01 09:09:34 crc kubenswrapper[4792]: W0301 
09:09:34.952339 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9105f6b0_6f16_47aa_8009_73736a90b765.slice/crio-77ac8f0f45b53e6a6ab8f82d7543660fc5c9f8ea61d9196529ac31e5ab986a7c WatchSource:0}: Error finding container 77ac8f0f45b53e6a6ab8f82d7543660fc5c9f8ea61d9196529ac31e5ab986a7c: Status 404 returned error can't find the container with id 77ac8f0f45b53e6a6ab8f82d7543660fc5c9f8ea61d9196529ac31e5ab986a7c Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.958050 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.980201 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.980263 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.980282 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.980305 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.980331 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.981448 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.988190 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-rbwx8"] Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.988807 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-pq28p"] Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.989042 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pq28p" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.989553 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.991019 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.991597 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.991645 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.991701 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.991706 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.991867 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.994135 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.994648 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7pp7m"] Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.995383 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.998793 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.999010 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.999082 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.999313 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.999451 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.999553 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.999585 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:34.999995 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.015086 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.029661 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.041667 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.050191 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.064251 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.079856 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.082282 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.082333 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.082362 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.082384 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.082399 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.090863 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.108895 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.123781 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.130920 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-k8s-cni-cncf-io\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.130960 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-node-log\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.130977 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-log-socket\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.130993 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131029 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-netns\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131047 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-netd\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131064 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-hostroot\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131079 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-conf-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131095 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-os-release\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131112 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-var-lib-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131127 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-cni-multus\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131142 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/131582d9-bd96-444b-a597-ceb81e2b2085-cni-binary-copy\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131166 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-cnibin\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131181 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131197 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-kubelet\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131212 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-env-overrides\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131229 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-system-cni-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131293 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-system-cni-dir\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131333 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-slash\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131350 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-systemd\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131365 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-config\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131384 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-daemon-config\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131401 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-os-release\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131435 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxpbv\" (UniqueName: \"kubernetes.io/projected/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-kube-api-access-hxpbv\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131453 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvrws\" (UniqueName: \"kubernetes.io/projected/131582d9-bd96-444b-a597-ceb81e2b2085-kube-api-access-mvrws\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131484 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-systemd-units\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131518 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-cni-binary-copy\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131533 
4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-etc-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131551 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-ovn\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131571 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-etc-kubernetes\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131587 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131604 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc 
kubenswrapper[4792]: I0301 09:09:35.131622 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-cni-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131647 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-cni-bin\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131677 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-script-lib\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131693 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqvlh\" (UniqueName: \"kubernetes.io/projected/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-kube-api-access-kqvlh\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131725 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/131582d9-bd96-444b-a597-ceb81e2b2085-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " 
pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-netns\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131782 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-multus-certs\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131797 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-bin\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131852 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovn-node-metrics-cert\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131877 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-cnibin\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " 
pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131939 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-socket-dir-parent\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131960 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-kubelet\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.136957 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.154523 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.171959 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.184987 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.185320 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.185349 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.185367 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.185379 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.192476 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.232606 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-cni-binary-copy\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.232824 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-etc-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.232964 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-ovn\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233092 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-etc-kubernetes\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233162 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-etc-kubernetes\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.232985 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-etc-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233014 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-ovn\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233328 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233410 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: 
\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233495 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233557 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-cni-binary-copy\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233542 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233636 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-cni-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233694 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-cni-bin\") pod \"multus-pq28p\" (UID: 
\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233722 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-script-lib\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233753 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqvlh\" (UniqueName: \"kubernetes.io/projected/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-kube-api-access-kqvlh\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/131582d9-bd96-444b-a597-ceb81e2b2085-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233794 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-cni-bin\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233799 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-netns\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " 
pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233867 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-multus-certs\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233892 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-bin\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234015 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovn-node-metrics-cert\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234041 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-cnibin\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234061 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-socket-dir-parent\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233958 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-multus-certs\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233819 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-netns\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233983 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-bin\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234164 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-cnibin\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234195 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-kubelet\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-k8s-cni-cncf-io\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234258 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-socket-dir-parent\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234269 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-kubelet\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234309 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-node-log\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234330 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-log-socket\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234379 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-k8s-cni-cncf-io\") pod \"multus-pq28p\" (UID: 
\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234410 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-node-log\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234448 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234505 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-log-socket\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234513 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234543 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-netns\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 
09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234558 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-script-lib\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234563 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-netd\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234594 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/131582d9-bd96-444b-a597-ceb81e2b2085-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234617 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-netns\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234627 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-hostroot\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234599 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-netd\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234653 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-conf-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234674 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-hostroot\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234678 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-os-release\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234715 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-var-lib-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234721 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-os-release\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234728 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-conf-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234734 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-cni-multus\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234754 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/131582d9-bd96-444b-a597-ceb81e2b2085-cni-binary-copy\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234774 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-cni-multus\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234787 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-cnibin\") 
pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234793 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-var-lib-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234806 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234841 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-kubelet\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234848 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-cnibin\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234871 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-env-overrides\") pod \"ovnkube-node-7pp7m\" 
(UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234886 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-kubelet\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234983 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-system-cni-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235018 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-system-cni-dir\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235043 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-slash\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235044 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-system-cni-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc 
kubenswrapper[4792]: I0301 09:09:35.235067 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-systemd\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235092 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-config\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235114 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-slash\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235118 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-systemd\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235093 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-system-cni-dir\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235118 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-daemon-config\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235197 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-os-release\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235235 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxpbv\" (UniqueName: \"kubernetes.io/projected/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-kube-api-access-hxpbv\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235261 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvrws\" (UniqueName: \"kubernetes.io/projected/131582d9-bd96-444b-a597-ceb81e2b2085-kube-api-access-mvrws\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235280 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-systemd-units\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235328 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-systemd-units\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235416 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/131582d9-bd96-444b-a597-ceb81e2b2085-cni-binary-copy\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235487 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-os-release\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235639 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-config\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235716 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-daemon-config\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235718 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rbwx8\" 
(UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235814 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-env-overrides\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.236392 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-cni-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.241640 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovn-node-metrics-cert\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.252445 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxpbv\" (UniqueName: \"kubernetes.io/projected/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-kube-api-access-hxpbv\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.256715 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvrws\" (UniqueName: \"kubernetes.io/projected/131582d9-bd96-444b-a597-ceb81e2b2085-kube-api-access-mvrws\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " 
pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.287802 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.288084 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.288249 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.288405 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.288568 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.356311 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.360295 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqvlh\" (UniqueName: \"kubernetes.io/projected/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-kube-api-access-kqvlh\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.363020 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: W0301 09:09:35.367017 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad44dee2_f99e_4e77_bc6a_2ab7f39eddf3.slice/crio-bf0d8f1e23790f6324b6b66e068f8d94854a2a7119fd193c81aa2574ed9f7de8 WatchSource:0}: Error finding container bf0d8f1e23790f6324b6b66e068f8d94854a2a7119fd193c81aa2574ed9f7de8: Status 404 returned error can't find the container with id bf0d8f1e23790f6324b6b66e068f8d94854a2a7119fd193c81aa2574ed9f7de8 Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.369840 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: W0301 09:09:35.376205 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod131582d9_bd96_444b_a597_ceb81e2b2085.slice/crio-1ce7de19ad7ca9fbffde845a67c6237dfb5fe98b6b1fff71cd698df905e9d226 WatchSource:0}: Error finding container 1ce7de19ad7ca9fbffde845a67c6237dfb5fe98b6b1fff71cd698df905e9d226: Status 404 returned error can't find the container with id 1ce7de19ad7ca9fbffde845a67c6237dfb5fe98b6b1fff71cd698df905e9d226 Mar 01 09:09:35 crc kubenswrapper[4792]: W0301 09:09:35.384179 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2bd7bac_21cf_4657_ab84_68a14f99f8f0.slice/crio-50a5a13eb582ab0332ca2180448f447182fdf584f81ede3ccf1a5ef0fe6bed57 WatchSource:0}: Error finding container 50a5a13eb582ab0332ca2180448f447182fdf584f81ede3ccf1a5ef0fe6bed57: Status 404 returned error can't find the container with id 50a5a13eb582ab0332ca2180448f447182fdf584f81ede3ccf1a5ef0fe6bed57 Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.390647 4792 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.390695 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.390706 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.390720 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.390729 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.410028 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.410066 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.410073 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.410133 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.410269 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.410353 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.414635 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.415308 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.417404 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.418455 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.419620 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.420608 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.422148 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.422705 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.424087 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.424678 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.425238 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.426414 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.426955 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.427843 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.428410 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.429399 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.430225 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.430628 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.431550 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.432333 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.432827 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.433878 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.434307 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.435482 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.436007 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.437010 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.437630 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.438676 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.439412 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.440294 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.441460 4792 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.441562 4792 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.443761 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.445341 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.445807 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.447409 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.448558 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.449163 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.450248 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.450921 4792 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.451788 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.452750 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.454005 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.454631 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.455537 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.456068 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.456994 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.457669 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.458486 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.459113 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.459995 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.460537 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.461095 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.462342 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.492918 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.492944 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.492952 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.492964 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.492973 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.537870 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.537989 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.538016 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538046 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:09:37.538014711 +0000 UTC m=+106.779893918 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.538092 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.538163 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538099 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538233 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:37.538214716 +0000 UTC m=+106.780094133 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538350 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538150 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538401 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538417 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538175 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538374 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538523 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538468 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:37.538454432 +0000 UTC m=+106.780333859 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538570 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:37.538558675 +0000 UTC m=+106.780438092 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538586 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:37.538577425 +0000 UTC m=+106.780456882 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.595026 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.595068 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.595077 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.595093 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.595104 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.697377 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.697422 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.697430 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.697442 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.697451 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.799916 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.799949 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.799958 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.799972 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.799981 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.891846 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.891897 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.891923 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"77ac8f0f45b53e6a6ab8f82d7543660fc5c9f8ea61d9196529ac31e5ab986a7c"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.893218 4792 generic.go:334] "Generic (PLEG): container finished" podID="131582d9-bd96-444b-a597-ceb81e2b2085" containerID="70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0" exitCode=0
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.893276 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerDied","Data":"70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.893293 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerStarted","Data":"1ce7de19ad7ca9fbffde845a67c6237dfb5fe98b6b1fff71cd698df905e9d226"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.894814 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerStarted","Data":"239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.894898 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerStarted","Data":"bf0d8f1e23790f6324b6b66e068f8d94854a2a7119fd193c81aa2574ed9f7de8"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.896332 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6" exitCode=0
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.896419 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.896478 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"50a5a13eb582ab0332ca2180448f447182fdf584f81ede3ccf1a5ef0fe6bed57"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.897920 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zql8j" event={"ID":"0982a9bb-56d4-4e1c-86cb-76a4152de9ba","Type":"ContainerStarted","Data":"bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.904044 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zql8j" event={"ID":"0982a9bb-56d4-4e1c-86cb-76a4152de9ba","Type":"ContainerStarted","Data":"ddbf2f829a1dc12dae3b569d0616d5fafc69ce7d261bb673e284e0d71f91480f"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.906761 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.906791 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.906802 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.906817 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.906828 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.920477 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.937835 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.949416 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z"
Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.971955 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.981884 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.994863 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.009507 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.009542 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.009562 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.009588 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.009600 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.011491 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.034984 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.054902 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.071895 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.084196 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.097658 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.110313 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.111895 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.111942 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.111950 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.111969 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.111978 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.124618 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.140163 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.152575 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.167551 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.184401 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.199272 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.211373 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.215629 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.215663 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.215672 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.215687 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.215701 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.223948 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.235275 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.247425 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.260518 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.317621 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.317660 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.317668 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.317680 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.317689 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.420206 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.420516 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.420526 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.420538 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.420549 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.522581 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.522624 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.522637 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.522651 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.522662 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.632735 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.632767 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.632778 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.632791 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.632801 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.735740 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.735765 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.735773 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.735789 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.735797 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.838610 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.838973 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.838988 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.839004 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.839015 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.902810 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerStarted","Data":"069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.907664 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.907710 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.907720 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.907730 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.909209 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c"} Mar 01 09:09:36 crc 
kubenswrapper[4792]: I0301 09:09:36.919618 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.933132 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.941378 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.941414 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.941423 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.941444 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.941455 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.949009 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.961766 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.973773 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.984814 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.996249 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.009074 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.028519 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.043350 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.043393 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.043404 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.043420 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.043430 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.046177 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.060180 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.074708 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.087738 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.098944 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.113182 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.126820 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.142937 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.145430 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.145461 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.145470 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.145495 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.145505 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.154118 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.165857 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.178441 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.193578 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.204040 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.214885 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.232383 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.247476 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.247514 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.247525 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.247539 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.247550 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.349901 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.349955 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.349965 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.349982 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.349992 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.407985 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.408022 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.407984 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.408109 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.408175 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.408247 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.452588 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.452617 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.452625 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.452638 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.452647 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.554888 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.554942 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.554954 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.554971 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.554987 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.560316 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.560396 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560449 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:09:41.560433875 +0000 UTC m=+110.802313072 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.560485 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.560504 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560507 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560523 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.560528 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560534 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560568 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:41.560558349 +0000 UTC m=+110.802437546 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560595 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560606 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560614 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560639 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:41.56063119 +0000 UTC m=+110.802510377 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560666 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560685 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:41.560680052 +0000 UTC m=+110.802559249 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560677 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560800 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:41.560770564 +0000 UTC m=+110.802649791 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.661538 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.661574 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.661586 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.661601 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.661611 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.763567 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.763613 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.763623 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.763637 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.763646 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.866174 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.866211 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.866223 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.866239 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.866250 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.899489 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4gj45"] Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.900070 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.901867 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.902134 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.903441 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.907799 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.914719 4792 generic.go:334] "Generic (PLEG): container finished" podID="131582d9-bd96-444b-a597-ceb81e2b2085" containerID="069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898" exitCode=0 Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.914806 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerDied","Data":"069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.924724 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.924760 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" 
event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.931262 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.953614 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.969817 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.969847 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.969857 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc 
kubenswrapper[4792]: I0301 09:09:37.969893 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.969919 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.973358 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.991743 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.005009 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.016618 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.031251 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.044031 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.054507 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.066134 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5923c286-5572-46c3-bed5-79cd67efc945-host\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.066263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntx44\" (UniqueName: \"kubernetes.io/projected/5923c286-5572-46c3-bed5-79cd67efc945-kube-api-access-ntx44\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.066297 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5923c286-5572-46c3-bed5-79cd67efc945-serviceca\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.069743 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.072073 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.072096 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.072105 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.072122 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.072131 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.083942 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.097372 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.115861 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.127751 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.138137 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.155072 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.166738 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5923c286-5572-46c3-bed5-79cd67efc945-host\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.166802 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntx44\" (UniqueName: \"kubernetes.io/projected/5923c286-5572-46c3-bed5-79cd67efc945-kube-api-access-ntx44\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.166822 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5923c286-5572-46c3-bed5-79cd67efc945-serviceca\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.166936 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5923c286-5572-46c3-bed5-79cd67efc945-host\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.167633 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5923c286-5572-46c3-bed5-79cd67efc945-serviceca\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.169431 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.173718 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.173747 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.173760 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.173775 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.173789 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.182940 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.187703 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntx44\" (UniqueName: \"kubernetes.io/projected/5923c286-5572-46c3-bed5-79cd67efc945-kube-api-access-ntx44\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.197037 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.209365 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.220757 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.228707 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.232437 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.243711 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.260205 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.274569 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.276408 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.276448 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.276462 4792 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.276477 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.276487 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.284016 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.378695 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.378729 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.378737 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 
09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.378756 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.378765 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.484992 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.485034 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.485045 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.485061 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.485072 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.588131 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.588206 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.588220 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.588247 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.588262 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.691409 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.691691 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.691789 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.691881 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.691998 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.794825 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.795493 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.795598 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.795673 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.795744 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.898527 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.898590 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.898613 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.898643 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.898663 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.928331 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4gj45" event={"ID":"5923c286-5572-46c3-bed5-79cd67efc945","Type":"ContainerStarted","Data":"c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.928372 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4gj45" event={"ID":"5923c286-5572-46c3-bed5-79cd67efc945","Type":"ContainerStarted","Data":"b0a9dab35e0ef28411206f88fa21e6ed13237f334bc482e178c18d5655cc7d3a"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.930711 4792 generic.go:334] "Generic (PLEG): container finished" podID="131582d9-bd96-444b-a597-ceb81e2b2085" containerID="bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43" exitCode=0 Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.930872 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerDied","Data":"bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.951542 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.970810 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.985176 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.001297 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 
09:09:39.001338 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.001349 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.001366 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.001376 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.004848 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.017542 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.033410 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.046589 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.058280 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.066183 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.066213 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.066222 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.066235 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.066254 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.072670 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z 
is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.080875 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.084282 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.084306 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.084314 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.084328 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.084337 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.097835 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.098279 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.101032 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.101071 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.101084 4792 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.101104 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.101117 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.111176 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.112738 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.119446 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.119476 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.119490 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.119505 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.119518 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.126435 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.131954 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.135769 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.135803 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.135815 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.135831 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.135872 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.143654 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.153047 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.153184 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.154595 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.154614 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.154623 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.154636 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.154645 4792 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.159686 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.179124 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.198828 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.216917 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.229814 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.245410 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 
09:09:39.256666 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.256702 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.256711 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.256725 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.256734 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.259164 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.272761 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.284510 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.293982 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.304291 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.315234 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.336757 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.359610 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.359652 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.359666 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.359684 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.359696 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.408719 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.408754 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.408791 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.408850 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.408942 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.409102 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.462484 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.462517 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.462525 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.462539 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.462584 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.565873 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.566055 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.566079 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.566104 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.566122 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.668850 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.668882 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.668891 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.668930 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.668945 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.770597 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.770704 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.770712 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.770725 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.770733 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.873791 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.873839 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.873854 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.873875 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.873885 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.937657 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.940034 4792 generic.go:334] "Generic (PLEG): container finished" podID="131582d9-bd96-444b-a597-ceb81e2b2085" containerID="5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087" exitCode=0 Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.940076 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerDied","Data":"5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.956154 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.966854 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.979448 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.979495 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.979507 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.979527 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.979540 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.982640 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.995381 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.004673 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.016397 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.033432 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d
44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.074575 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.085363 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.085401 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.085410 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc 
kubenswrapper[4792]: I0301 09:09:40.085426 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.085436 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.113042 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.135594 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.150235 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.162396 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.175558 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.188054 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.188101 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.188114 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.188130 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.188141 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.293050 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.293095 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.293108 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.293126 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.293138 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.399629 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.399685 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.399701 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.399719 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.399736 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.502684 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.502857 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.502943 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.502976 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.502997 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.605131 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.605658 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.605743 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.605851 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.605945 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.708497 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.708530 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.708539 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.708551 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.708560 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.811552 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.811606 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.811624 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.811646 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.811663 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.914208 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.914265 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.914282 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.914307 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.914325 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.946333 4792 generic.go:334] "Generic (PLEG): container finished" podID="131582d9-bd96-444b-a597-ceb81e2b2085" containerID="ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301" exitCode=0 Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.946595 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerDied","Data":"ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.959633 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.971138 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.981376 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.999517 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.017099 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.017280 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.017307 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.017318 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.017333 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.017346 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.017760 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.017971 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.021275 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d
44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.032722 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.049867 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.068159 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.082750 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.096408 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.116947 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.119318 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.119355 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.119369 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.119387 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.119398 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.137732 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326a
daa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.151495 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.222073 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.222108 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.222119 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.222140 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.222153 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.324791 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.324829 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.324838 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.324853 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.324861 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.408327 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.408344 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.408406 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.408533 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.408783 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.408824 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.420425 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.427178 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.427215 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.427226 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.427242 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.427252 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.430771 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.449897 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.464057 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.477581 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.492933 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.509655 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.523790 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.529009 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.529039 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.529054 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.529073 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.529085 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.539961 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z 
is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.556841 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.570128 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.585302 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.599151 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.604503 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.604606 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.604644 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604658 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:09:49.604639039 +0000 UTC m=+118.846518236 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.604705 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604753 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604765 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604776 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604787 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604806 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604820 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604830 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604809 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:49.604801674 +0000 UTC m=+118.846680871 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604874 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-01 09:09:49.604861165 +0000 UTC m=+118.846740362 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604893 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:49.604884916 +0000 UTC m=+118.846764113 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.604737 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.605043 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.605241 4792 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:49.605210244 +0000 UTC m=+118.847089441 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.631451 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.631504 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.631518 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.631538 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.631550 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.734569 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.734625 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.734636 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.734659 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.734675 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.837049 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.837089 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.837100 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.837116 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.837126 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.939770 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.939799 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.939809 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.939822 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.939831 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.952399 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.952985 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.956355 4792 generic.go:334] "Generic (PLEG): container finished" podID="131582d9-bd96-444b-a597-ceb81e2b2085" containerID="9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f" exitCode=0 Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.956383 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerDied","Data":"9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.967524 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.979342 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.989610 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.994167 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.007755 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.020560 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.034874 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.042075 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.042101 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.042109 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.042121 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.042129 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.051362 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.063561 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.079740 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.093281 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.105788 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.116866 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.130478 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.144731 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.144772 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.144800 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.144815 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.144826 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.144729 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.154452 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.163780 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.178762 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.198145 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaea
d203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"o
s-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabo
uts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.210491 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.228495 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.240473 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.249067 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.249113 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.249125 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc 
kubenswrapper[4792]: I0301 09:09:42.249142 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.249153 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.260133 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.272730 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:3
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.286408 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.304156 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.320140 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.351506 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.351533 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.351541 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.351554 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.351563 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.453815 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.453842 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.453850 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.453863 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.453873 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.555878 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.555923 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.555932 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.555944 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.555954 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.658365 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.658393 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.658403 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.658415 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.658424 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.760733 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.760758 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.760767 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.760778 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.760787 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.862812 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.862859 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.862869 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.862923 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.862934 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.962698 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerStarted","Data":"67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.963620 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.963665 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.964338 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.964363 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.964373 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.964386 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.964411 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.978661 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.984034 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.992062 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.004981 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.018244 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.034434 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.048523 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.063255 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.066691 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc 
kubenswrapper[4792]: I0301 09:09:43.066771 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.066792 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.066843 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.066873 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.078274 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258
aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.090658 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.103651 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.126588 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.141109 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.152805 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.167809 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.168436 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.168492 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.168506 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.168523 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc 
kubenswrapper[4792]: I0301 09:09:43.168556 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.181501 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.192926 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.204556 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.214308 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.226550 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053
c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.240441 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e
5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d
7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.254125 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.265268 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.270065 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.270086 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.270094 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.270107 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.270115 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.275073 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.286437 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.301114 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.318800 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.372385 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.372432 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.372441 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.372454 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.372462 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.410980 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:43 crc kubenswrapper[4792]: E0301 09:09:43.412587 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.412935 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:43 crc kubenswrapper[4792]: E0301 09:09:43.412987 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.413021 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:43 crc kubenswrapper[4792]: E0301 09:09:43.413058 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.475487 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.475521 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.475531 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.475546 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.475555 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.578236 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.578279 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.578290 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.578306 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.578317 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.680413 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.680690 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.680863 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.681014 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.681138 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.783747 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.783792 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.783801 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.783817 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.783828 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.885850 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.885943 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.885963 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.885989 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.886009 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.989141 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.989220 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.989230 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.989245 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.989255 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.091059 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.091086 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.091094 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.091108 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.091117 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.193438 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.193472 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.193481 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.193494 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.193503 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.296038 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.296077 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.296089 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.296610 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.296629 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.399208 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.399250 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.399260 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.399274 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.399283 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.501497 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.501540 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.501551 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.501575 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.501590 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.603929 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.603967 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.603975 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.603989 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.603998 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.705998 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.706035 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.706049 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.706064 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.706074 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.808739 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.808792 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.808810 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.808845 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.808857 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.911387 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.911425 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.911436 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.911451 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.911462 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.970037 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/0.log" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.973463 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23" exitCode=1 Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.973520 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.974830 4792 scope.go:117] "RemoveContainer" containerID="26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.989484 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:44Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.000806 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:44Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.013743 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.014004 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.014013 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc 
kubenswrapper[4792]: I0301 09:09:45.014027 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.014037 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.024076 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:44Z\\\",\\\"message\\\":\\\"nPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0301 09:09:44.646724 6493 reflector.go:311] Stopping 
reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646811 6493 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.646980 6493 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646998 6493 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647121 6493 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647665 6493 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0301 09:09:44.647688 6493 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0301 09:09:44.647713 6493 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0301 09:09:44.647733 6493 factory.go:656] Stopping watch factory\\\\nI0301 09:09:44.647748 6493 ovnkube.go:599] Stopped ovnkube\\\\nI0301 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.064199 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.089512 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.114240 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.116178 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.116213 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.116223 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.116238 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.116249 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.134074 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.152960 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.162762 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.176117 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.188060 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.200823 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.216239 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.218841 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.218867 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.218876 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.218891 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.218900 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.321197 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.321230 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.321243 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.321259 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.321269 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.409628 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:45 crc kubenswrapper[4792]: E0301 09:09:45.409718 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.410041 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:45 crc kubenswrapper[4792]: E0301 09:09:45.410095 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.410133 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:45 crc kubenswrapper[4792]: E0301 09:09:45.410170 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.437808 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.437943 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.438028 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.438148 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.438240 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.541774 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.542097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.542264 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.542431 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.542658 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.646451 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.646492 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.646505 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.646521 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.646532 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.748494 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.748531 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.748541 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.748556 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.748566 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.850375 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.850413 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.850423 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.850437 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.850447 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.952741 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.952793 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.952808 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.952826 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.952836 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.977862 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/0.log" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.980332 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.981203 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.001405 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:44Z\\\",\\\"message\\\":\\\"nPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0301 09:09:44.646724 6493 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646811 6493 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.646980 6493 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646998 6493 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647121 6493 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647665 6493 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0301 09:09:44.647688 6493 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0301 09:09:44.647713 6493 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0301 09:09:44.647733 6493 factory.go:656] Stopping watch factory\\\\nI0301 09:09:44.647748 6493 ovnkube.go:599] Stopped ovnkube\\\\nI0301 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.013798 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.024492 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.037213 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.049088 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.054879 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.054927 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.054935 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.054947 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.054962 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.064305 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.077113 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.091516 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.100970 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.112997 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.126834 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e18
0d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:
09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.135309 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.147060 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.156613 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.156649 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.156660 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 
09:09:46.156675 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.156685 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.260278 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.260327 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.260339 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.260358 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.260370 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.362978 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.363025 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.363038 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.363060 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.363072 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.465378 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.465436 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.465458 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.465491 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.465510 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.567361 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.567389 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.567398 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.567411 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.567420 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.669469 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.669503 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.669512 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.669524 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.669532 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.772047 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.772119 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.772137 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.772161 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.772218 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.874048 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.874077 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.874085 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.874097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.874106 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.955536 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn"] Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.956130 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.957510 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.957784 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.958283 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ddb0171-7126-45ef-aea2-8433f52357a6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.958517 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx98x\" (UniqueName: \"kubernetes.io/projected/9ddb0171-7126-45ef-aea2-8433f52357a6-kube-api-access-rx98x\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.958733 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ddb0171-7126-45ef-aea2-8433f52357a6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.958838 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ddb0171-7126-45ef-aea2-8433f52357a6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.970816 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.976052 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.976087 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.976097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.976113 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.976124 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.985064 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/1.log" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.985500 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/0.log" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.986839 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.988160 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" 
event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.988264 4792 scope.go:117] "RemoveContainer" containerID="26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.988423 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25" exitCode=1 Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.989957 4792 scope.go:117] "RemoveContainer" containerID="52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25" Mar 01 09:09:46 crc kubenswrapper[4792]: E0301 09:09:46.990202 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.001865 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.013030 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.024012 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.033443 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.045176 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.058031 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e18
0d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:
09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.059403 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ddb0171-7126-45ef-aea2-8433f52357a6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.059465 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx98x\" (UniqueName: \"kubernetes.io/projected/9ddb0171-7126-45ef-aea2-8433f52357a6-kube-api-access-rx98x\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.059531 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ddb0171-7126-45ef-aea2-8433f52357a6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.059557 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ddb0171-7126-45ef-aea2-8433f52357a6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.060693 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ddb0171-7126-45ef-aea2-8433f52357a6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.060949 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ddb0171-7126-45ef-aea2-8433f52357a6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.066383 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ddb0171-7126-45ef-aea2-8433f52357a6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.068110 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.075841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx98x\" (UniqueName: \"kubernetes.io/projected/9ddb0171-7126-45ef-aea2-8433f52357a6-kube-api-access-rx98x\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.077674 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.077780 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.077853 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.077948 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.078024 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.081978 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.098297 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:44Z\\\",\\\"message\\\":\\\"nPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0301 09:09:44.646724 6493 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646811 6493 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.646980 6493 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646998 6493 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647121 6493 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647665 6493 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0301 09:09:44.647688 6493 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0301 09:09:44.647713 6493 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0301 09:09:44.647733 6493 factory.go:656] Stopping watch factory\\\\nI0301 09:09:44.647748 6493 ovnkube.go:599] Stopped ovnkube\\\\nI0301 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.109249 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.120249 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.130516 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.142571 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.156430 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.170417 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.180754 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.180821 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.180838 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.180855 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.180866 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.186557 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.197113 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.208952 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.222959 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.235674 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.254653 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.264810 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.268739 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.279602 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a
37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-0
1T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: W0301 09:09:47.282794 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ddb0171_7126_45ef_aea2_8433f52357a6.slice/crio-2b48f991cbaafe8d2f382b896d1059580c6167e8c8b319e3db1398bab76a7ad3 WatchSource:0}: Error finding container 2b48f991cbaafe8d2f382b896d1059580c6167e8c8b319e3db1398bab76a7ad3: Status 404 returned error can't find the container with id 2b48f991cbaafe8d2f382b896d1059580c6167e8c8b319e3db1398bab76a7ad3 Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.282983 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.283003 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.283011 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.283023 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.283031 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.303122 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:44Z\\\",\\\"message\\\":\\\"nPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0301 09:09:44.646724 6493 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646811 6493 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.646980 6493 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646998 6493 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647121 6493 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647665 6493 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0301 09:09:44.647688 6493 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0301 09:09:44.647713 6493 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0301 09:09:44.647733 6493 factory.go:656] Stopping watch factory\\\\nI0301 09:09:44.647748 6493 ovnkube.go:599] Stopped ovnkube\\\\nI0301 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{
\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mo
untPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 
09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.318162 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.330668 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.386805 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.386921 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.386936 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.386953 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.386987 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.408348 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.408394 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:47 crc kubenswrapper[4792]: E0301 09:09:47.408440 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:47 crc kubenswrapper[4792]: E0301 09:09:47.408525 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.408402 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:47 crc kubenswrapper[4792]: E0301 09:09:47.408646 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.489816 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.489847 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.489855 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.489883 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.489892 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.592660 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.592715 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.592730 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.592751 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.592764 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.696053 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.696121 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.696134 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.696153 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.696190 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.798427 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.798466 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.798506 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.798522 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.798533 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.902181 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.902217 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.902226 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.902238 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.902246 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.992510 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" event={"ID":"9ddb0171-7126-45ef-aea2-8433f52357a6","Type":"ContainerStarted","Data":"2b48f991cbaafe8d2f382b896d1059580c6167e8c8b319e3db1398bab76a7ad3"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.993383 4792 scope.go:117] "RemoveContainer" containerID="52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25" Mar 01 09:09:47 crc kubenswrapper[4792]: E0301 09:09:47.993589 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.005089 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.005151 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.005167 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.005187 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.005227 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.007937 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.021862 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.036741 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.048318 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.058214 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.067752 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.075956 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.085939 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.097689 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.107369 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.107406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.107417 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.107433 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.107444 4792 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.108122 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.119761 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.132277 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.148879 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.165634 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.210137 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.210190 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.210203 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.210225 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.210238 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.312423 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.312454 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.312463 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.312475 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.312484 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.414995 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.415084 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.415110 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.415147 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.415177 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.518482 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.518535 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.518548 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.518567 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.518584 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.625839 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.625921 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.625939 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.625955 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.625967 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.728786 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.728818 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.728829 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.728842 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.728851 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.798463 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-frm7z"] Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.798988 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:48 crc kubenswrapper[4792]: E0301 09:09:48.799062 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.811913 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.825161 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.832032 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.832114 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.832131 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.832148 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.832160 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.834947 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.846370 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.859022 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.869131 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.874855 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw62d\" (UniqueName: \"kubernetes.io/projected/fa0bf523-6582-46b4-9134-28880a50b474-kube-api-access-gw62d\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.875205 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.882596 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.894528 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.913311 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.925722 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.936140 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.936184 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.936196 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.936215 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.936226 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.941013 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.954926 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.967364 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.975817 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.975917 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw62d\" (UniqueName: \"kubernetes.io/projected/fa0bf523-6582-46b4-9134-28880a50b474-kube-api-access-gw62d\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:48 crc kubenswrapper[4792]: E0301 09:09:48.976036 4792 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:48 crc kubenswrapper[4792]: E0301 09:09:48.976114 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:49.476097057 +0000 UTC m=+118.717976254 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.985068 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.995212 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw62d\" (UniqueName: \"kubernetes.io/projected/fa0bf523-6582-46b4-9134-28880a50b474-kube-api-access-gw62d\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.000115 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"netw
ork-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.003804 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" event={"ID":"9ddb0171-7126-45ef-aea2-8433f52357a6","Type":"ContainerStarted","Data":"c1c0585a4ca4551b3b8182c6427a8b2a5b8536cf388b7a9075819593b63502cf"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.003848 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" event={"ID":"9ddb0171-7126-45ef-aea2-8433f52357a6","Type":"ContainerStarted","Data":"0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.005797 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/1.log" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.039489 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.039524 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.039532 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.039547 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.039557 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.141358 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.141385 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.141393 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.141407 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.141416 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.243731 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.243779 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.243795 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.243817 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.243835 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.346142 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.346194 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.346206 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.346224 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.346236 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.408662 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.408730 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.408826 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.408820 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.408967 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.409038 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.436744 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.436783 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.436795 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.436811 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.436823 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.448970 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:49Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.452339 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.452379 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.452389 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.452405 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.452416 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.464509 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:49Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.467770 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.467806 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.467816 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.467830 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.467840 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.479345 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:49Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.479493 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.479672 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.479742 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:50.479724106 +0000 UTC m=+119.721603303 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.484438 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.484472 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.484481 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.484494 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.484504 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.500881 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:49Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.504339 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.504376 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.504386 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.504400 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.504410 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.516620 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:49Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.516736 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.518303 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.518328 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.518337 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.518350 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.518359 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.620455 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.620511 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.620523 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.620538 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.620567 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.681186 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.681319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681341 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:10:05.681314255 +0000 UTC m=+134.923193462 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.681413 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681435 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.681459 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.681493 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:49 crc 
kubenswrapper[4792]: E0301 09:09:49.681527 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:05.681517891 +0000 UTC m=+134.923397088 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681595 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681638 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:05.681623323 +0000 UTC m=+134.923502610 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681759 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681814 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681833 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681894 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:05.681871019 +0000 UTC m=+134.923750216 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681766 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681944 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681955 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681990 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:05.681980762 +0000 UTC m=+134.923860059 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.723061 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.723135 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.723144 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.723184 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.723195 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.825976 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.826016 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.826040 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.826055 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.826065 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.928395 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.928453 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.928462 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.928478 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.928488 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.030943 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.031018 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.031035 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.031058 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.031107 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.034347 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.052172 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.071383 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.089378 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e18
0d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:
09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.101092 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.114920 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.126402 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.133301 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.133337 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.133347 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.133365 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.133376 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.138712 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b8536cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.152134 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.166875 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.191713 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.205492 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.218887 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.230938 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.235546 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 
09:09:50.235579 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.235588 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.235602 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.235612 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.242879 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.338832 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.338882 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.338895 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.338932 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.338952 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.408537 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:50 crc kubenswrapper[4792]: E0301 09:09:50.408689 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.441580 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.441649 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.441662 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.441679 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.441691 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.489064 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:50 crc kubenswrapper[4792]: E0301 09:09:50.489204 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:50 crc kubenswrapper[4792]: E0301 09:09:50.489264 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:52.489248502 +0000 UTC m=+121.731127709 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.543698 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.543759 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.543780 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.543809 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.543835 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.647369 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.647420 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.647431 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.647447 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.647457 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.750353 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.750425 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.750448 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.750478 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.750501 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.853100 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.853166 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.853177 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.853192 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.853203 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.955839 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.955874 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.955883 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.955898 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.955933 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.058258 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.058309 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.058326 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.058348 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.058365 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:51Z","lastTransitionTime":"2026-03-01T09:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.160629 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.160666 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.160677 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.160694 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.160706 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:51Z","lastTransitionTime":"2026-03-01T09:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.263601 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.263873 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.264016 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.264092 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.264158 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:51Z","lastTransitionTime":"2026-03-01T09:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:51 crc kubenswrapper[4792]: E0301 09:09:51.364607 4792 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.407725 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.407758 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:51 crc kubenswrapper[4792]: E0301 09:09:51.407934 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.408400 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:51 crc kubenswrapper[4792]: E0301 09:09:51.408505 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:51 crc kubenswrapper[4792]: E0301 09:09:51.408600 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.422400 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.432734 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.440986 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.453577 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.466090 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.478072 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.491712 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.503741 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.525956 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.539474 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.553059 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.565139 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.576329 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc 
kubenswrapper[4792]: I0301 09:09:51.592007 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.604238 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: E0301 09:09:51.753989 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:09:52 crc kubenswrapper[4792]: I0301 09:09:52.407924 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:52 crc kubenswrapper[4792]: E0301 09:09:52.408065 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:09:52 crc kubenswrapper[4792]: I0301 09:09:52.507744 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:52 crc kubenswrapper[4792]: E0301 09:09:52.507861 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:52 crc kubenswrapper[4792]: E0301 09:09:52.507929 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:56.507894243 +0000 UTC m=+125.749773440 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:53 crc kubenswrapper[4792]: I0301 09:09:53.380812 4792 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 01 09:09:53 crc kubenswrapper[4792]: I0301 09:09:53.408238 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:53 crc kubenswrapper[4792]: I0301 09:09:53.408332 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:53 crc kubenswrapper[4792]: E0301 09:09:53.408469 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:53 crc kubenswrapper[4792]: I0301 09:09:53.408579 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:53 crc kubenswrapper[4792]: E0301 09:09:53.408701 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:53 crc kubenswrapper[4792]: E0301 09:09:53.408756 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:54 crc kubenswrapper[4792]: I0301 09:09:54.407768 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:54 crc kubenswrapper[4792]: E0301 09:09:54.408027 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:09:55 crc kubenswrapper[4792]: I0301 09:09:55.407889 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:55 crc kubenswrapper[4792]: I0301 09:09:55.407942 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:55 crc kubenswrapper[4792]: I0301 09:09:55.407889 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:55 crc kubenswrapper[4792]: E0301 09:09:55.408011 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:55 crc kubenswrapper[4792]: E0301 09:09:55.408241 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:55 crc kubenswrapper[4792]: E0301 09:09:55.408307 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:55 crc kubenswrapper[4792]: I0301 09:09:55.408687 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:09:55 crc kubenswrapper[4792]: E0301 09:09:55.408877 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:09:56 crc kubenswrapper[4792]: I0301 09:09:56.408167 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:56 crc kubenswrapper[4792]: E0301 09:09:56.408333 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:09:56 crc kubenswrapper[4792]: I0301 09:09:56.548750 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:56 crc kubenswrapper[4792]: E0301 09:09:56.549236 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:56 crc kubenswrapper[4792]: E0301 09:09:56.549396 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:04.54936556 +0000 UTC m=+133.791244777 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:56 crc kubenswrapper[4792]: E0301 09:09:56.754929 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:09:57 crc kubenswrapper[4792]: I0301 09:09:57.408195 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:57 crc kubenswrapper[4792]: I0301 09:09:57.408195 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:57 crc kubenswrapper[4792]: I0301 09:09:57.408227 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:57 crc kubenswrapper[4792]: E0301 09:09:57.408698 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:57 crc kubenswrapper[4792]: E0301 09:09:57.408611 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:57 crc kubenswrapper[4792]: E0301 09:09:57.409161 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:58 crc kubenswrapper[4792]: I0301 09:09:58.407693 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:58 crc kubenswrapper[4792]: E0301 09:09:58.407819 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.408217 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.408271 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.408368 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.408472 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.409419 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.409584 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.410010 4792 scope.go:117] "RemoveContainer" containerID="52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.733675 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.733738 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.733759 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.733781 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.733795 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:59Z","lastTransitionTime":"2026-03-01T09:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.746893 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:59Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.750409 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.750468 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.750480 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.750497 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.750530 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:59Z","lastTransitionTime":"2026-03-01T09:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.762757 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:59Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.765960 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.766006 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.766015 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.766027 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.766036 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:59Z","lastTransitionTime":"2026-03-01T09:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.776564 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:59Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.780025 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.780057 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.780067 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.780083 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.780096 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:59Z","lastTransitionTime":"2026-03-01T09:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.810285 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:59Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.814430 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.814463 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.814474 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.814486 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.814494 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:59Z","lastTransitionTime":"2026-03-01T09:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.828186 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:59Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.828301 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.046509 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/1.log" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.056649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456"} Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.057131 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.074262 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.089454 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.101282 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.111540 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.126556 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.140389 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.150502 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.159531 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.178346 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service 
openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.189733 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.201563 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.213966 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.225566 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc 
kubenswrapper[4792]: I0301 09:10:00.238637 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.249362 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.408377 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:00 crc kubenswrapper[4792]: E0301 09:10:00.408510 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.062157 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/2.log" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.063192 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/1.log" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.066417 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456" exitCode=1 Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.066473 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456"} Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.066540 4792 scope.go:117] "RemoveContainer" containerID="52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.067529 4792 scope.go:117] "RemoveContainer" containerID="e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456" Mar 01 09:10:01 crc kubenswrapper[4792]: E0301 09:10:01.067816 
4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.083436 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-i
dentity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.096733 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.106404 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc 
kubenswrapper[4792]: I0301 09:10:01.120901 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.132817 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.143984 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.153553 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.162128 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.174378 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.187490 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.197787 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.209728 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.220183 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.238170 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.248490 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.408746 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.408814 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.408875 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:01 crc kubenswrapper[4792]: E0301 09:10:01.409012 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:01 crc kubenswrapper[4792]: E0301 09:10:01.409191 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:01 crc kubenswrapper[4792]: E0301 09:10:01.409318 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.424580 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.438638 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.453165 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.462442 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc 
kubenswrapper[4792]: I0301 09:10:01.473574 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.486161 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"run
ning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.497352 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.511061 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.524899 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.538735 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.553162 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.564665 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.576897 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.597005 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.606341 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: E0301 09:10:01.756743 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.071310 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/2.log" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.076643 4792 scope.go:117] "RemoveContainer" containerID="e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456" Mar 01 09:10:02 crc kubenswrapper[4792]: E0301 09:10:02.076953 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.091512 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.111188 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.124415 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.143786 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.166260 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.177386 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.190852 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.203583 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.222339 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.238029 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.254568 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.268943 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.283933 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc 
kubenswrapper[4792]: I0301 09:10:02.299964 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.314301 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.407850 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:02 crc kubenswrapper[4792]: E0301 09:10:02.408015 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:03 crc kubenswrapper[4792]: I0301 09:10:03.408073 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:03 crc kubenswrapper[4792]: I0301 09:10:03.408181 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:03 crc kubenswrapper[4792]: I0301 09:10:03.408072 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:03 crc kubenswrapper[4792]: E0301 09:10:03.408336 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:03 crc kubenswrapper[4792]: E0301 09:10:03.408410 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:03 crc kubenswrapper[4792]: E0301 09:10:03.408623 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:04 crc kubenswrapper[4792]: I0301 09:10:04.408397 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:04 crc kubenswrapper[4792]: E0301 09:10:04.409193 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:04 crc kubenswrapper[4792]: I0301 09:10:04.631461 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:04 crc kubenswrapper[4792]: E0301 09:10:04.631717 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:10:04 crc kubenswrapper[4792]: E0301 09:10:04.631875 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:20.631845054 +0000 UTC m=+149.873724291 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.408600 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.408600 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.408765 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.408832 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.409646 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.409994 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.418862 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.741811 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.741928 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.741996 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.742032 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:05 crc 
kubenswrapper[4792]: E0301 09:10:05.742056 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:10:37.742028416 +0000 UTC m=+166.983907633 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.742091 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742152 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742155 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742214 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 
09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742217 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742225 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742235 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:37.74221555 +0000 UTC m=+166.984094747 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742240 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742170 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742274 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:37.742258931 +0000 UTC m=+166.984138238 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742274 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742295 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:37.742284172 +0000 UTC m=+166.984163509 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742317 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:37.742307292 +0000 UTC m=+166.984186619 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:10:06 crc kubenswrapper[4792]: I0301 09:10:06.408431 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:06 crc kubenswrapper[4792]: E0301 09:10:06.408889 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:06 crc kubenswrapper[4792]: E0301 09:10:06.758958 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:07 crc kubenswrapper[4792]: I0301 09:10:07.408503 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:07 crc kubenswrapper[4792]: I0301 09:10:07.408518 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:07 crc kubenswrapper[4792]: E0301 09:10:07.408723 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:07 crc kubenswrapper[4792]: E0301 09:10:07.408854 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:07 crc kubenswrapper[4792]: I0301 09:10:07.408965 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:07 crc kubenswrapper[4792]: E0301 09:10:07.409480 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:07 crc kubenswrapper[4792]: I0301 09:10:07.409588 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:10:07 crc kubenswrapper[4792]: E0301 09:10:07.409868 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:10:07 crc kubenswrapper[4792]: I0301 09:10:07.423622 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 01 09:10:08 crc kubenswrapper[4792]: I0301 09:10:08.408447 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:08 crc kubenswrapper[4792]: E0301 09:10:08.408773 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.408660 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:09 crc kubenswrapper[4792]: E0301 09:10:09.409289 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.408712 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:09 crc kubenswrapper[4792]: E0301 09:10:09.409531 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.408660 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:09 crc kubenswrapper[4792]: E0301 09:10:09.409784 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.997767 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.998098 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.998184 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.998285 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.998625 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:09Z","lastTransitionTime":"2026-03-01T09:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:10 crc kubenswrapper[4792]: E0301 09:10:10.011645 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:10Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.016891 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.017006 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.017028 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.017060 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.017082 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:10Z","lastTransitionTime":"2026-03-01T09:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:10 crc kubenswrapper[4792]: E0301 09:10:10.033337 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:10Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.037071 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.037110 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.037119 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.037136 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.037146 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:10Z","lastTransitionTime":"2026-03-01T09:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:10 crc kubenswrapper[4792]: E0301 09:10:10.047943 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:10Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.053204 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.053338 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.053409 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.053506 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.053599 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:10Z","lastTransitionTime":"2026-03-01T09:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:10 crc kubenswrapper[4792]: E0301 09:10:10.067765 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:10Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.071932 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.072078 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.072161 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.072254 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.072355 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:10Z","lastTransitionTime":"2026-03-01T09:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:10 crc kubenswrapper[4792]: E0301 09:10:10.086580 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:10Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:10 crc kubenswrapper[4792]: E0301 09:10:10.086996 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.407999 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:10 crc kubenswrapper[4792]: E0301 09:10:10.408430 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.408562 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.408642 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.408571 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:11 crc kubenswrapper[4792]: E0301 09:10:11.408797 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:11 crc kubenswrapper[4792]: E0301 09:10:11.408960 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:11 crc kubenswrapper[4792]: E0301 09:10:11.409134 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.428074 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.
d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.445873 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.464673 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.478019 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.488830 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.502685 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.523127 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.536532 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.547896 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.569406 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.581753 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.597820 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.613131 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.625353 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.636437 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc 
kubenswrapper[4792]: I0301 09:10:11.651090 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.666386 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: E0301 09:10:11.759970 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:12 crc kubenswrapper[4792]: I0301 09:10:12.407760 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:12 crc kubenswrapper[4792]: E0301 09:10:12.408016 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:13 crc kubenswrapper[4792]: I0301 09:10:13.407769 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:13 crc kubenswrapper[4792]: E0301 09:10:13.407890 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:13 crc kubenswrapper[4792]: I0301 09:10:13.407963 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:13 crc kubenswrapper[4792]: E0301 09:10:13.408027 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:13 crc kubenswrapper[4792]: I0301 09:10:13.408075 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:13 crc kubenswrapper[4792]: E0301 09:10:13.408144 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:14 crc kubenswrapper[4792]: I0301 09:10:14.408292 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:14 crc kubenswrapper[4792]: E0301 09:10:14.408534 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:14 crc kubenswrapper[4792]: I0301 09:10:14.409369 4792 scope.go:117] "RemoveContainer" containerID="e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456" Mar 01 09:10:14 crc kubenswrapper[4792]: E0301 09:10:14.409567 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:10:15 crc kubenswrapper[4792]: I0301 09:10:15.408316 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:15 crc kubenswrapper[4792]: I0301 09:10:15.408316 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:15 crc kubenswrapper[4792]: E0301 09:10:15.408430 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:15 crc kubenswrapper[4792]: E0301 09:10:15.408489 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:15 crc kubenswrapper[4792]: I0301 09:10:15.408334 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:15 crc kubenswrapper[4792]: E0301 09:10:15.408549 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:16 crc kubenswrapper[4792]: I0301 09:10:16.407804 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:16 crc kubenswrapper[4792]: E0301 09:10:16.407950 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:16 crc kubenswrapper[4792]: E0301 09:10:16.761384 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:17 crc kubenswrapper[4792]: I0301 09:10:17.408762 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:17 crc kubenswrapper[4792]: I0301 09:10:17.408806 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:17 crc kubenswrapper[4792]: I0301 09:10:17.408891 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:17 crc kubenswrapper[4792]: E0301 09:10:17.409099 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:17 crc kubenswrapper[4792]: E0301 09:10:17.409234 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:17 crc kubenswrapper[4792]: E0301 09:10:17.409363 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:18 crc kubenswrapper[4792]: I0301 09:10:18.408534 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:18 crc kubenswrapper[4792]: E0301 09:10:18.409345 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:19 crc kubenswrapper[4792]: I0301 09:10:19.408031 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:19 crc kubenswrapper[4792]: I0301 09:10:19.408126 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:19 crc kubenswrapper[4792]: I0301 09:10:19.408187 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:19 crc kubenswrapper[4792]: E0301 09:10:19.409402 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:19 crc kubenswrapper[4792]: E0301 09:10:19.409481 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:19 crc kubenswrapper[4792]: E0301 09:10:19.409530 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.133664 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.133695 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.133704 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.133717 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.133727 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:20Z","lastTransitionTime":"2026-03-01T09:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.145808 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:20Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.149740 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.149777 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.149786 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.149808 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.149817 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:20Z","lastTransitionTime":"2026-03-01T09:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.161563 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:20Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.164732 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.164787 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.164799 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.164817 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.165209 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:20Z","lastTransitionTime":"2026-03-01T09:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.176244 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:20Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.178991 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.179026 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.179035 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.179049 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.179058 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:20Z","lastTransitionTime":"2026-03-01T09:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.194502 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:20Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.197894 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.197948 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.197958 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.197973 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.197981 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:20Z","lastTransitionTime":"2026-03-01T09:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.211046 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:20Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.211215 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.408397 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.408587 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.697350 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.697666 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.698648 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:52.697783158 +0000 UTC m=+181.939662405 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.408852 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.408921 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.408859 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:21 crc kubenswrapper[4792]: E0301 09:10:21.409020 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:21 crc kubenswrapper[4792]: E0301 09:10:21.409155 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:21 crc kubenswrapper[4792]: E0301 09:10:21.409207 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.421577 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.433144 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.455863 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.469346 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.482109 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.496813 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.508266 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.524189 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc 
kubenswrapper[4792]: I0301 09:10:21.536238 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.548752 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.564042 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.576899 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69
520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.589937 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.605166 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.616323 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.625386 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.636853 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: E0301 09:10:21.762607 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:22 crc kubenswrapper[4792]: I0301 09:10:22.407786 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:22 crc kubenswrapper[4792]: E0301 09:10:22.408242 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:22 crc kubenswrapper[4792]: I0301 09:10:22.408332 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:10:22 crc kubenswrapper[4792]: E0301 09:10:22.408481 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.140053 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/0.log" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.140126 4792 generic.go:334] "Generic (PLEG): container finished" podID="ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3" containerID="239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae" exitCode=1 Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.140160 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerDied","Data":"239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae"} Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.140559 4792 scope.go:117] "RemoveContainer" containerID="239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.160415 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.175291 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.189850 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.200879 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.210984 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.225550 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:22Z\\\",\\\"message\\\":\\\"2026-03-01T09:09:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556\\\\n2026-03-01T09:09:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556 to /host/opt/cni/bin/\\\\n2026-03-01T09:09:37Z [verbose] multus-daemon started\\\\n2026-03-01T09:09:37Z [verbose] Readiness Indicator file check\\\\n2026-03-01T09:10:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.242478 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.252987 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.264793 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.276103 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.292189 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.303256 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.316355 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.329190 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.337942 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc 
kubenswrapper[4792]: I0301 09:10:23.349177 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.360749 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.407949 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:23 crc kubenswrapper[4792]: E0301 09:10:23.408069 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.408110 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.407949 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:23 crc kubenswrapper[4792]: E0301 09:10:23.408234 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:23 crc kubenswrapper[4792]: E0301 09:10:23.408277 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.145169 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/0.log" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.145225 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerStarted","Data":"833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc1"} Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.158795 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.171544 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.189975 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.202406 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.214769 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.227354 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.239945 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.250249 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc 
kubenswrapper[4792]: I0301 09:10:24.262196 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.277738 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.290035 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.302036 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.312539 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.321325 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.331518 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc
1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:22Z\\\",\\\"message\\\":\\\"2026-03-01T09:09:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556\\\\n2026-03-01T09:09:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556 to /host/opt/cni/bin/\\\\n2026-03-01T09:09:37Z [verbose] multus-daemon started\\\\n2026-03-01T09:09:37Z [verbose] Readiness Indicator file check\\\\n2026-03-01T09:10:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.345983 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.356230 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.408573 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:24 crc kubenswrapper[4792]: E0301 09:10:24.408716 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:25 crc kubenswrapper[4792]: I0301 09:10:25.407981 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:25 crc kubenswrapper[4792]: E0301 09:10:25.408105 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:25 crc kubenswrapper[4792]: I0301 09:10:25.408293 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:25 crc kubenswrapper[4792]: E0301 09:10:25.408354 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:25 crc kubenswrapper[4792]: I0301 09:10:25.408500 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:25 crc kubenswrapper[4792]: E0301 09:10:25.408559 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:26 crc kubenswrapper[4792]: I0301 09:10:26.408604 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:26 crc kubenswrapper[4792]: E0301 09:10:26.408751 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:26 crc kubenswrapper[4792]: I0301 09:10:26.409474 4792 scope.go:117] "RemoveContainer" containerID="e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456" Mar 01 09:10:26 crc kubenswrapper[4792]: E0301 09:10:26.764263 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.157241 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/2.log" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.161602 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.162118 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.175872 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.190721 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.202880 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.232009 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.246004 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.264506 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.281230 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.296516 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc 
kubenswrapper[4792]: I0301 09:10:27.314298 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.332652 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.346357 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.362763 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://833126c3956c8927e1b68252bd9962e43df9c6e09
dc2b98a20208c2db19a5fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:22Z\\\",\\\"message\\\":\\\"2026-03-01T09:09:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556\\\\n2026-03-01T09:09:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556 to /host/opt/cni/bin/\\\\n2026-03-01T09:09:37Z [verbose] multus-daemon started\\\\n2026-03-01T09:09:37Z [verbose] Readiness Indicator file check\\\\n2026-03-01T09:10:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.377574 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.387538 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.397790 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.408390 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:27 crc kubenswrapper[4792]: E0301 09:10:27.408533 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.408755 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:27 crc kubenswrapper[4792]: E0301 09:10:27.408882 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.408963 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:27 crc kubenswrapper[4792]: E0301 09:10:27.409035 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.411195 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.422979 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.167807 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/3.log" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.168726 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/2.log" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.172679 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" exitCode=1 Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.172716 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.172753 4792 scope.go:117] "RemoveContainer" containerID="e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.175709 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:10:28 crc kubenswrapper[4792]: E0301 09:10:28.176117 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 
09:10:28.190170 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.204835 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.222860 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:27Z\\\",\\\"message\\\":\\\"nityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built 
lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.254\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:10:27.273332 7207 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.233331 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.246046 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.259657 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.273378 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.285978 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc 
kubenswrapper[4792]: I0301 09:10:28.298737 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.312520 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.327720 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:22Z\\\",\\\"message\\\":\\\"2026-03-01T09:09:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556\\\\n2026-03-01T09:09:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556 to /host/opt/cni/bin/\\\\n2026-03-01T09:09:37Z [verbose] multus-daemon started\\\\n2026-03-01T09:09:37Z [verbose] Readiness Indicator file check\\\\n2026-03-01T09:10:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.342360 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258
aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.353007 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.365037 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.376623 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.387833 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.396999 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.408360 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:28 crc kubenswrapper[4792]: E0301 09:10:28.408596 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.179695 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/3.log" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.185199 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:10:29 crc kubenswrapper[4792]: E0301 09:10:29.185382 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.201047 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.219804 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.242799 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.257947 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.270837 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.291590 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc
1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:22Z\\\",\\\"message\\\":\\\"2026-03-01T09:09:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556\\\\n2026-03-01T09:09:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556 to /host/opt/cni/bin/\\\\n2026-03-01T09:09:37Z [verbose] multus-daemon started\\\\n2026-03-01T09:09:37Z [verbose] Readiness Indicator file check\\\\n2026-03-01T09:10:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.313154 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.325364 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.342742 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.356280 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.374149 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:27Z\\\",\\\"message\\\":\\\"nityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.254\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:10:27.273332 7207 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:10:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.391982 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.404546 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.407741 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.407779 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.407798 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:29 crc kubenswrapper[4792]: E0301 09:10:29.407934 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:29 crc kubenswrapper[4792]: E0301 09:10:29.408022 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:29 crc kubenswrapper[4792]: E0301 09:10:29.408151 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.422644 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.437288 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc 
kubenswrapper[4792]: I0301 09:10:29.453617 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.467601 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.408474 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:30 crc kubenswrapper[4792]: E0301 09:10:30.408598 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.502056 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.502093 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.502103 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.502138 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.502148 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:30Z","lastTransitionTime":"2026-03-01T09:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:30 crc kubenswrapper[4792]: E0301 09:10:30.514417 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:30Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.517933 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.518001 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.518013 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.518029 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.518038 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:30Z","lastTransitionTime":"2026-03-01T09:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:30 crc kubenswrapper[4792]: E0301 09:10:30.546739 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:30Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.553372 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.553396 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.553406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.553420 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.553431 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:30Z","lastTransitionTime":"2026-03-01T09:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:30 crc kubenswrapper[4792]: E0301 09:10:30.574097 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:30Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.580521 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.580548 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.580556 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.580570 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.580580 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:30Z","lastTransitionTime":"2026-03-01T09:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:30 crc kubenswrapper[4792]: E0301 09:10:30.593665 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:30Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.596547 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.596580 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.596591 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.596604 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.596613 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:30Z","lastTransitionTime":"2026-03-01T09:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:30 crc kubenswrapper[4792]: E0301 09:10:30.607398 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:30Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:30 crc kubenswrapper[4792]: E0301 09:10:30.607648 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.408021 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.408021 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.408130 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:31 crc kubenswrapper[4792]: E0301 09:10:31.408550 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:31 crc kubenswrapper[4792]: E0301 09:10:31.408377 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:31 crc kubenswrapper[4792]: E0301 09:10:31.408606 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.421389 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42
ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b8536cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.433665 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.446938 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.464890 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:27Z\\\",\\\"message\\\":\\\"nityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.254\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:10:27.273332 7207 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:10:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.481240 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.499313 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.510983 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.521204 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc 
kubenswrapper[4792]: I0301 09:10:31.533256 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.546381 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.558037 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.571672 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://833126c3956c8927e1b68252bd9962e43df9c6e09
dc2b98a20208c2db19a5fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:22Z\\\",\\\"message\\\":\\\"2026-03-01T09:09:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556\\\\n2026-03-01T09:09:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556 to /host/opt/cni/bin/\\\\n2026-03-01T09:09:37Z [verbose] multus-daemon started\\\\n2026-03-01T09:09:37Z [verbose] Readiness Indicator file check\\\\n2026-03-01T09:10:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.592138 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.603286 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.614539 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.626632 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.638132 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: E0301 09:10:31.765143 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:32 crc kubenswrapper[4792]: I0301 09:10:32.408431 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:32 crc kubenswrapper[4792]: E0301 09:10:32.408642 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:33 crc kubenswrapper[4792]: I0301 09:10:33.407768 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:33 crc kubenswrapper[4792]: E0301 09:10:33.407897 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:33 crc kubenswrapper[4792]: I0301 09:10:33.408400 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:33 crc kubenswrapper[4792]: I0301 09:10:33.408528 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:33 crc kubenswrapper[4792]: E0301 09:10:33.408894 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:33 crc kubenswrapper[4792]: E0301 09:10:33.409392 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:33 crc kubenswrapper[4792]: I0301 09:10:33.409489 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:10:33 crc kubenswrapper[4792]: E0301 09:10:33.410381 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:10:34 crc kubenswrapper[4792]: I0301 09:10:34.408452 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:34 crc kubenswrapper[4792]: E0301 09:10:34.408585 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:34 crc kubenswrapper[4792]: I0301 09:10:34.418529 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 01 09:10:35 crc kubenswrapper[4792]: I0301 09:10:35.409013 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:35 crc kubenswrapper[4792]: I0301 09:10:35.409007 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:35 crc kubenswrapper[4792]: I0301 09:10:35.409117 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:35 crc kubenswrapper[4792]: E0301 09:10:35.409331 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:35 crc kubenswrapper[4792]: E0301 09:10:35.409417 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:35 crc kubenswrapper[4792]: E0301 09:10:35.409508 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:36 crc kubenswrapper[4792]: I0301 09:10:36.408758 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:36 crc kubenswrapper[4792]: E0301 09:10:36.410031 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:36 crc kubenswrapper[4792]: E0301 09:10:36.766497 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.408399 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.408397 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.408603 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.408757 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.408930 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.409045 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.769298 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.769404 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.769495 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769529 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.769498696 +0000 UTC m=+231.011377933 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769567 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.769584 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769612 4792 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769628 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.769609598 +0000 UTC m=+231.011488895 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769739 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769758 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769772 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769787 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769805 4792 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769814 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.769803912 +0000 UTC m=+231.011683209 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769818 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769844 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.769822163 +0000 UTC m=+231.011701360 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.769697 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769867 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.769859103 +0000 UTC m=+231.011738300 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 01 09:10:38 crc kubenswrapper[4792]: I0301 09:10:38.408197 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:10:38 crc kubenswrapper[4792]: E0301 09:10:38.408339 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:10:39 crc kubenswrapper[4792]: I0301 09:10:39.408752 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:10:39 crc kubenswrapper[4792]: I0301 09:10:39.408826 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:10:39 crc kubenswrapper[4792]: I0301 09:10:39.408825 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:10:39 crc kubenswrapper[4792]: E0301 09:10:39.408956 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:10:39 crc kubenswrapper[4792]: E0301 09:10:39.409074 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:10:39 crc kubenswrapper[4792]: E0301 09:10:39.409625 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:10:39 crc kubenswrapper[4792]: I0301 09:10:39.409949 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"
Mar 01 09:10:39 crc kubenswrapper[4792]: E0301 09:10:39.410144 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.408321 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:10:40 crc kubenswrapper[4792]: E0301 09:10:40.408498 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.653778 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.653851 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.653882 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.653939 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.653954 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:40Z","lastTransitionTime":"2026-03-01T09:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.707928 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"]
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.708483 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.711299 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.711306 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.711503 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.711524 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.754568 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=33.754537164 podStartE2EDuration="33.754537164s" podCreationTimestamp="2026-03-01 09:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.73469428 +0000 UTC m=+169.976573517" watchObservedRunningTime="2026-03-01 09:10:40.754537164 +0000 UTC m=+169.996416401"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.793341 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.794871 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zql8j" podStartSLOduration=109.794852286 podStartE2EDuration="1m49.794852286s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.793693252 +0000 UTC m=+170.035572449" watchObservedRunningTime="2026-03-01 09:10:40.794852286 +0000 UTC m=+170.036731483"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.801006 4792 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.804497 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.804544 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.804629 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.804657 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.804677 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.826573 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" podStartSLOduration=108.826553169 podStartE2EDuration="1m48.826553169s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.826437076 +0000 UTC m=+170.068316263" watchObservedRunningTime="2026-03-01 09:10:40.826553169 +0000 UTC m=+170.068432386"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.827198 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pq28p" podStartSLOduration=108.827188872 podStartE2EDuration="1m48.827188872s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.808839669 +0000 UTC m=+170.050718886" watchObservedRunningTime="2026-03-01 09:10:40.827188872 +0000 UTC m=+170.069068079"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.840994 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4gj45" podStartSLOduration=109.84097686 podStartE2EDuration="1m49.84097686s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.84049673 +0000 UTC m=+170.082375937" watchObservedRunningTime="2026-03-01 09:10:40.84097686 +0000 UTC m=+170.082856067"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.863885 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=35.863867858 podStartE2EDuration="35.863867858s" podCreationTimestamp="2026-03-01 09:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.85293689 +0000 UTC m=+170.094816107" watchObservedRunningTime="2026-03-01 09:10:40.863867858 +0000 UTC m=+170.105747065"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.906070 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.906115 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.906150 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.906188 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.906215 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.906278 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.906415 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.907066 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.907717 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podStartSLOduration=108.907707094 podStartE2EDuration="1m48.907707094s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.879241609 +0000 UTC m=+170.121120816" watchObservedRunningTime="2026-03-01 09:10:40.907707094 +0000 UTC m=+170.149586291"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.914288 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.930311 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" podStartSLOduration=108.930294086 podStartE2EDuration="1m48.930294086s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.928789414 +0000 UTC m=+170.170668611" watchObservedRunningTime="2026-03-01 09:10:40.930294086 +0000 UTC m=+170.172173283"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.931829 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.958296 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=6.95826853 podStartE2EDuration="6.95826853s" podCreationTimestamp="2026-03-01 09:10:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.947616098 +0000 UTC m=+170.189495295" watchObservedRunningTime="2026-03-01 09:10:40.95826853 +0000 UTC m=+170.200147727"
Mar 01 09:10:41 crc kubenswrapper[4792]: I0301 09:10:41.026652 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"
Mar 01 09:10:41 crc kubenswrapper[4792]: W0301 09:10:41.044148 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4c63f6a_1c48_4f27_8687_b7be8c24fcb9.slice/crio-f41d2c318ff605f83c38e7b35e1f9361796bdba7a0e6aa5a26fd9330f35552a4 WatchSource:0}: Error finding container f41d2c318ff605f83c38e7b35e1f9361796bdba7a0e6aa5a26fd9330f35552a4: Status 404 returned error can't find the container with id f41d2c318ff605f83c38e7b35e1f9361796bdba7a0e6aa5a26fd9330f35552a4
Mar 01 09:10:41 crc kubenswrapper[4792]: I0301 09:10:41.224897 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" event={"ID":"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9","Type":"ContainerStarted","Data":"5569e4527c708368f798d693e286f5fd93cc26ec2fc46b0ee1fda4c9120d9d05"}
Mar 01 09:10:41 crc kubenswrapper[4792]: I0301 09:10:41.224966 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" event={"ID":"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9","Type":"ContainerStarted","Data":"f41d2c318ff605f83c38e7b35e1f9361796bdba7a0e6aa5a26fd9330f35552a4"}
Mar 01 09:10:41 crc kubenswrapper[4792]: I0301 09:10:41.237776 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" podStartSLOduration=110.237759249 podStartE2EDuration="1m50.237759249s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:41.236968012 +0000 UTC m=+170.478847209" watchObservedRunningTime="2026-03-01 09:10:41.237759249 +0000 UTC m=+170.479638446"
Mar 01 09:10:41 crc kubenswrapper[4792]: I0301 09:10:41.408542 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:10:41 crc kubenswrapper[4792]: I0301 09:10:41.408684 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:10:41 crc kubenswrapper[4792]: E0301 09:10:41.409299 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:10:41 crc kubenswrapper[4792]: I0301 09:10:41.409323 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:10:41 crc kubenswrapper[4792]: E0301 09:10:41.409443 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:10:41 crc kubenswrapper[4792]: E0301 09:10:41.409654 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:10:41 crc kubenswrapper[4792]: E0301 09:10:41.768160 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 01 09:10:42 crc kubenswrapper[4792]: I0301 09:10:42.407756 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:10:42 crc kubenswrapper[4792]: E0301 09:10:42.407976 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:10:43 crc kubenswrapper[4792]: I0301 09:10:43.408127 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:10:43 crc kubenswrapper[4792]: I0301 09:10:43.408196 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:10:43 crc kubenswrapper[4792]: I0301 09:10:43.408145 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:10:43 crc kubenswrapper[4792]: E0301 09:10:43.408281 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:10:43 crc kubenswrapper[4792]: E0301 09:10:43.408461 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:10:43 crc kubenswrapper[4792]: E0301 09:10:43.408501 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:10:44 crc kubenswrapper[4792]: I0301 09:10:44.407925 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:10:44 crc kubenswrapper[4792]: E0301 09:10:44.408061 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:10:45 crc kubenswrapper[4792]: I0301 09:10:45.409595 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:10:45 crc kubenswrapper[4792]: E0301 09:10:45.409744 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:10:45 crc kubenswrapper[4792]: I0301 09:10:45.410001 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:10:45 crc kubenswrapper[4792]: E0301 09:10:45.410105 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:10:45 crc kubenswrapper[4792]: I0301 09:10:45.409554 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:10:45 crc kubenswrapper[4792]: E0301 09:10:45.412056 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:10:46 crc kubenswrapper[4792]: I0301 09:10:46.408394 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:10:46 crc kubenswrapper[4792]: E0301 09:10:46.408566 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:10:46 crc kubenswrapper[4792]: I0301 09:10:46.409382 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2"
Mar 01 09:10:46 crc kubenswrapper[4792]: E0301 09:10:46.409776 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 01 09:10:46 crc kubenswrapper[4792]: E0301 09:10:46.770303 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 01 09:10:47 crc kubenswrapper[4792]: I0301 09:10:47.408340 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:10:47 crc kubenswrapper[4792]: I0301 09:10:47.408399 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:10:47 crc kubenswrapper[4792]: E0301 09:10:47.408477 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:10:47 crc kubenswrapper[4792]: I0301 09:10:47.408399 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:10:47 crc kubenswrapper[4792]: E0301 09:10:47.408552 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:10:47 crc kubenswrapper[4792]: E0301 09:10:47.408577 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:10:48 crc kubenswrapper[4792]: I0301 09:10:48.408220 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:10:48 crc kubenswrapper[4792]: E0301 09:10:48.408612 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:10:48 crc kubenswrapper[4792]: I0301 09:10:48.426677 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Mar 01 09:10:49 crc kubenswrapper[4792]: I0301 09:10:49.408131 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:10:49 crc kubenswrapper[4792]: I0301 09:10:49.408143 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:10:49 crc kubenswrapper[4792]: E0301 09:10:49.408332 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:10:49 crc kubenswrapper[4792]: E0301 09:10:49.408427 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:10:49 crc kubenswrapper[4792]: I0301 09:10:49.408159 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:10:49 crc kubenswrapper[4792]: E0301 09:10:49.408506 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:10:50 crc kubenswrapper[4792]: I0301 09:10:50.408784 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:10:50 crc kubenswrapper[4792]: E0301 09:10:50.409069 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:10:51 crc kubenswrapper[4792]: I0301 09:10:51.409938 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:10:51 crc kubenswrapper[4792]: I0301 09:10:51.410037 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:10:51 crc kubenswrapper[4792]: I0301 09:10:51.410334 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"
Mar 01 09:10:51 crc kubenswrapper[4792]: E0301 09:10:51.410297 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:10:51 crc kubenswrapper[4792]: I0301 09:10:51.410048 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:10:51 crc kubenswrapper[4792]: E0301 09:10:51.410437 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:51 crc kubenswrapper[4792]: E0301 09:10:51.410472 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:10:51 crc kubenswrapper[4792]: E0301 09:10:51.410508 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:51 crc kubenswrapper[4792]: I0301 09:10:51.437106 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.437084381 podStartE2EDuration="3.437084381s" podCreationTimestamp="2026-03-01 09:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:51.435730982 +0000 UTC m=+180.677610199" watchObservedRunningTime="2026-03-01 09:10:51.437084381 +0000 UTC m=+180.678963698" Mar 01 09:10:51 crc kubenswrapper[4792]: E0301 09:10:51.771317 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 01 09:10:52 crc kubenswrapper[4792]: I0301 09:10:52.408489 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:52 crc kubenswrapper[4792]: E0301 09:10:52.408681 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:52 crc kubenswrapper[4792]: I0301 09:10:52.713510 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:52 crc kubenswrapper[4792]: E0301 09:10:52.714512 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:10:52 crc kubenswrapper[4792]: E0301 09:10:52.714589 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:56.714569037 +0000 UTC m=+245.956448244 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:10:53 crc kubenswrapper[4792]: I0301 09:10:53.407711 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:53 crc kubenswrapper[4792]: E0301 09:10:53.407844 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:53 crc kubenswrapper[4792]: I0301 09:10:53.407859 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:53 crc kubenswrapper[4792]: I0301 09:10:53.407726 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:53 crc kubenswrapper[4792]: E0301 09:10:53.407993 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:53 crc kubenswrapper[4792]: E0301 09:10:53.408069 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:54 crc kubenswrapper[4792]: I0301 09:10:54.408606 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:54 crc kubenswrapper[4792]: E0301 09:10:54.408734 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:55 crc kubenswrapper[4792]: I0301 09:10:55.408810 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:55 crc kubenswrapper[4792]: I0301 09:10:55.408843 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:55 crc kubenswrapper[4792]: E0301 09:10:55.409064 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:55 crc kubenswrapper[4792]: I0301 09:10:55.409219 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:55 crc kubenswrapper[4792]: E0301 09:10:55.409470 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:55 crc kubenswrapper[4792]: E0301 09:10:55.409867 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:56 crc kubenswrapper[4792]: I0301 09:10:56.407890 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:56 crc kubenswrapper[4792]: E0301 09:10:56.408325 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:56 crc kubenswrapper[4792]: E0301 09:10:56.773167 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:57 crc kubenswrapper[4792]: I0301 09:10:57.408796 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:57 crc kubenswrapper[4792]: E0301 09:10:57.408994 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:57 crc kubenswrapper[4792]: I0301 09:10:57.409312 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:57 crc kubenswrapper[4792]: E0301 09:10:57.409375 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:57 crc kubenswrapper[4792]: I0301 09:10:57.409489 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:57 crc kubenswrapper[4792]: E0301 09:10:57.409641 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:57 crc kubenswrapper[4792]: I0301 09:10:57.409997 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:10:58 crc kubenswrapper[4792]: I0301 09:10:58.278595 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 01 09:10:58 crc kubenswrapper[4792]: I0301 09:10:58.281639 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40"} Mar 01 09:10:58 crc kubenswrapper[4792]: I0301 09:10:58.283098 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:10:58 crc kubenswrapper[4792]: I0301 09:10:58.408047 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:58 crc kubenswrapper[4792]: E0301 09:10:58.408184 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:59 crc kubenswrapper[4792]: I0301 09:10:59.408124 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:59 crc kubenswrapper[4792]: I0301 09:10:59.408184 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:59 crc kubenswrapper[4792]: I0301 09:10:59.408265 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:59 crc kubenswrapper[4792]: E0301 09:10:59.408325 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:59 crc kubenswrapper[4792]: E0301 09:10:59.408257 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:59 crc kubenswrapper[4792]: E0301 09:10:59.408441 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:00 crc kubenswrapper[4792]: I0301 09:11:00.408673 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:00 crc kubenswrapper[4792]: E0301 09:11:00.408784 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:11:01 crc kubenswrapper[4792]: I0301 09:11:01.408283 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:01 crc kubenswrapper[4792]: I0301 09:11:01.408394 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:01 crc kubenswrapper[4792]: I0301 09:11:01.409812 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:01 crc kubenswrapper[4792]: E0301 09:11:01.409799 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:11:01 crc kubenswrapper[4792]: E0301 09:11:01.410042 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:11:01 crc kubenswrapper[4792]: E0301 09:11:01.410110 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:01 crc kubenswrapper[4792]: E0301 09:11:01.774581 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:11:02 crc kubenswrapper[4792]: I0301 09:11:02.407854 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:02 crc kubenswrapper[4792]: E0301 09:11:02.408083 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:11:03 crc kubenswrapper[4792]: I0301 09:11:03.408705 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:03 crc kubenswrapper[4792]: E0301 09:11:03.408850 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:11:03 crc kubenswrapper[4792]: I0301 09:11:03.409044 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:03 crc kubenswrapper[4792]: E0301 09:11:03.409147 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:11:03 crc kubenswrapper[4792]: I0301 09:11:03.409213 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:03 crc kubenswrapper[4792]: E0301 09:11:03.409611 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:03 crc kubenswrapper[4792]: I0301 09:11:03.409951 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:11:03 crc kubenswrapper[4792]: E0301 09:11:03.410204 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:11:04 crc kubenswrapper[4792]: I0301 09:11:04.408710 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:04 crc kubenswrapper[4792]: E0301 09:11:04.409133 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:11:05 crc kubenswrapper[4792]: I0301 09:11:05.408342 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:05 crc kubenswrapper[4792]: E0301 09:11:05.408807 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:05 crc kubenswrapper[4792]: I0301 09:11:05.408518 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:05 crc kubenswrapper[4792]: E0301 09:11:05.409262 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:11:05 crc kubenswrapper[4792]: I0301 09:11:05.408384 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:05 crc kubenswrapper[4792]: E0301 09:11:05.410305 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:11:06 crc kubenswrapper[4792]: I0301 09:11:06.408605 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:06 crc kubenswrapper[4792]: E0301 09:11:06.408778 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:11:06 crc kubenswrapper[4792]: E0301 09:11:06.776193 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:11:07 crc kubenswrapper[4792]: I0301 09:11:07.407954 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:07 crc kubenswrapper[4792]: I0301 09:11:07.407996 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:07 crc kubenswrapper[4792]: E0301 09:11:07.408128 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:07 crc kubenswrapper[4792]: I0301 09:11:07.408269 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:07 crc kubenswrapper[4792]: E0301 09:11:07.408416 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:11:07 crc kubenswrapper[4792]: E0301 09:11:07.408507 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:11:08 crc kubenswrapper[4792]: I0301 09:11:08.408275 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:08 crc kubenswrapper[4792]: E0301 09:11:08.408588 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.321063 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/1.log" Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.322174 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/0.log" Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.322308 4792 generic.go:334] "Generic (PLEG): container finished" podID="ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3" containerID="833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc1" exitCode=1 Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.322355 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerDied","Data":"833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc1"} Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.322405 4792 scope.go:117] "RemoveContainer" containerID="239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae" Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.322819 4792 scope.go:117] "RemoveContainer" containerID="833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc1" Mar 01 09:11:09 crc kubenswrapper[4792]: E0301 09:11:09.323061 4792 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-pq28p_openshift-multus(ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3)\"" pod="openshift-multus/multus-pq28p" podUID="ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3" Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.341782 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=96.341757252 podStartE2EDuration="1m36.341757252s" podCreationTimestamp="2026-03-01 09:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:58.316163183 +0000 UTC m=+187.558042450" watchObservedRunningTime="2026-03-01 09:11:09.341757252 +0000 UTC m=+198.583636449" Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.408465 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.408491 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.408519 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:09 crc kubenswrapper[4792]: E0301 09:11:09.408613 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:09 crc kubenswrapper[4792]: E0301 09:11:09.408682 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:11:09 crc kubenswrapper[4792]: E0301 09:11:09.408752 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:11:10 crc kubenswrapper[4792]: I0301 09:11:10.326293 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/1.log" Mar 01 09:11:10 crc kubenswrapper[4792]: I0301 09:11:10.407739 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:10 crc kubenswrapper[4792]: E0301 09:11:10.407876 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:11:11 crc kubenswrapper[4792]: I0301 09:11:11.408137 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:11 crc kubenswrapper[4792]: I0301 09:11:11.408343 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:11 crc kubenswrapper[4792]: E0301 09:11:11.408333 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:11:11 crc kubenswrapper[4792]: I0301 09:11:11.408408 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:11 crc kubenswrapper[4792]: E0301 09:11:11.408511 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:11:11 crc kubenswrapper[4792]: E0301 09:11:11.409498 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:11 crc kubenswrapper[4792]: E0301 09:11:11.776963 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:11:12 crc kubenswrapper[4792]: I0301 09:11:12.208370 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:11:12 crc kubenswrapper[4792]: I0301 09:11:12.408207 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:12 crc kubenswrapper[4792]: E0301 09:11:12.408328 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:11:13 crc kubenswrapper[4792]: I0301 09:11:13.407832 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:13 crc kubenswrapper[4792]: E0301 09:11:13.407992 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:11:13 crc kubenswrapper[4792]: I0301 09:11:13.408060 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:13 crc kubenswrapper[4792]: I0301 09:11:13.408080 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:13 crc kubenswrapper[4792]: E0301 09:11:13.408219 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:13 crc kubenswrapper[4792]: E0301 09:11:13.408297 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:11:14 crc kubenswrapper[4792]: I0301 09:11:14.407995 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:14 crc kubenswrapper[4792]: E0301 09:11:14.408143 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:11:15 crc kubenswrapper[4792]: I0301 09:11:15.408754 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:15 crc kubenswrapper[4792]: E0301 09:11:15.408947 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:11:15 crc kubenswrapper[4792]: I0301 09:11:15.409214 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:15 crc kubenswrapper[4792]: E0301 09:11:15.409300 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:15 crc kubenswrapper[4792]: I0301 09:11:15.409566 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:15 crc kubenswrapper[4792]: E0301 09:11:15.409655 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:11:16 crc kubenswrapper[4792]: I0301 09:11:16.408691 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:16 crc kubenswrapper[4792]: E0301 09:11:16.408806 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:11:16 crc kubenswrapper[4792]: E0301 09:11:16.778183 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:11:17 crc kubenswrapper[4792]: I0301 09:11:17.408085 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:17 crc kubenswrapper[4792]: I0301 09:11:17.408129 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:17 crc kubenswrapper[4792]: E0301 09:11:17.408208 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:17 crc kubenswrapper[4792]: I0301 09:11:17.408220 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:17 crc kubenswrapper[4792]: E0301 09:11:17.408510 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:11:17 crc kubenswrapper[4792]: E0301 09:11:17.408626 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:11:18 crc kubenswrapper[4792]: I0301 09:11:18.408061 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:18 crc kubenswrapper[4792]: E0301 09:11:18.408860 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:11:18 crc kubenswrapper[4792]: I0301 09:11:18.409400 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.357878 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/3.log" Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.360075 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.360476 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.383316 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podStartSLOduration=147.383301273 podStartE2EDuration="2m27.383301273s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:19.382583375 +0000 UTC m=+208.624462572" watchObservedRunningTime="2026-03-01 09:11:19.383301273 +0000 UTC m=+208.625180470" Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.410768 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:19 crc kubenswrapper[4792]: E0301 09:11:19.410887 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.411086 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:19 crc kubenswrapper[4792]: E0301 09:11:19.411144 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.411774 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:19 crc kubenswrapper[4792]: E0301 09:11:19.411849 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.520169 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-frm7z"] Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.520294 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:19 crc kubenswrapper[4792]: E0301 09:11:19.520382 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:11:21 crc kubenswrapper[4792]: I0301 09:11:21.407934 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:21 crc kubenswrapper[4792]: I0301 09:11:21.407939 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:21 crc kubenswrapper[4792]: I0301 09:11:21.408006 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:21 crc kubenswrapper[4792]: I0301 09:11:21.408023 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:21 crc kubenswrapper[4792]: E0301 09:11:21.408952 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:11:21 crc kubenswrapper[4792]: E0301 09:11:21.409040 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:11:21 crc kubenswrapper[4792]: E0301 09:11:21.409134 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:11:21 crc kubenswrapper[4792]: E0301 09:11:21.409220 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:21 crc kubenswrapper[4792]: E0301 09:11:21.779287 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:11:23 crc kubenswrapper[4792]: I0301 09:11:23.408575 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:23 crc kubenswrapper[4792]: E0301 09:11:23.408747 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:11:23 crc kubenswrapper[4792]: I0301 09:11:23.408825 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:23 crc kubenswrapper[4792]: I0301 09:11:23.408934 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:23 crc kubenswrapper[4792]: E0301 09:11:23.409075 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:11:23 crc kubenswrapper[4792]: I0301 09:11:23.409094 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:23 crc kubenswrapper[4792]: E0301 09:11:23.409239 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:11:23 crc kubenswrapper[4792]: I0301 09:11:23.409375 4792 scope.go:117] "RemoveContainer" containerID="833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc1" Mar 01 09:11:23 crc kubenswrapper[4792]: E0301 09:11:23.409438 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:24 crc kubenswrapper[4792]: I0301 09:11:24.378409 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/1.log" Mar 01 09:11:24 crc kubenswrapper[4792]: I0301 09:11:24.378506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerStarted","Data":"43e6bc214d9cf4f260a6b3f92b2bb7a5207d98f68c3fc275ce08d07a9684d65b"} Mar 01 09:11:25 crc kubenswrapper[4792]: I0301 09:11:25.407970 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:25 crc kubenswrapper[4792]: I0301 09:11:25.408002 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:25 crc kubenswrapper[4792]: I0301 09:11:25.408086 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:25 crc kubenswrapper[4792]: E0301 09:11:25.408235 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:11:25 crc kubenswrapper[4792]: E0301 09:11:25.408384 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:11:25 crc kubenswrapper[4792]: I0301 09:11:25.408409 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:25 crc kubenswrapper[4792]: E0301 09:11:25.408449 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:11:25 crc kubenswrapper[4792]: E0301 09:11:25.408881 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.408250 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.408289 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.408292 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.408330 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.414264 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.414536 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.414683 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.414766 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.415176 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.415185 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.618074 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 01 09:11:31 crc 
kubenswrapper[4792]: I0301 09:11:31.658599 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.659219 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.669132 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.669754 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.669753 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.670726 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.670887 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.670773 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.671241 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-wxl8v"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.671957 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-wxl8v" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.674214 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.674874 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-zrzcg"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.675349 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.675881 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.677164 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6lk5b"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.682362 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.685007 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.685687 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.686029 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.686177 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.686586 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.694572 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.706553 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wl9zt"] Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.707795 4792 reflector.go:561] object-"openshift-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.708015 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.719111 4792 reflector.go:561] object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff": failed to list *v1.Secret: secrets "openshift-apiserver-sa-dockercfg-djjff" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: 
E0301 09:11:31.719174 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-djjff\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-sa-dockercfg-djjff\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.719467 4792 reflector.go:561] object-"openshift-apiserver"/"audit-1": failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.719487 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.719615 4792 reflector.go:561] object-"openshift-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.719627 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace 
\"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.720043 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.721991 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tswcj"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.722408 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.732734 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8qrq4"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.732846 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723604 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.733344 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.733573 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723238 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723661 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.723710 4792 reflector.go:561] object-"openshift-apiserver"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.733719 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723810 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.723834 4792 reflector.go:561] object-"openshift-apiserver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.733820 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in 
API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723850 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.723858 4792 reflector.go:561] object-"openshift-apiserver"/"image-import-ca": failed to list *v1.ConfigMap: configmaps "image-import-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.733932 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"image-import-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"image-import-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.723892 4792 reflector.go:561] object-"openshift-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.733953 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" 
logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723896 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723900 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.723980 4792 reflector.go:561] object-"openshift-apiserver"/"etcd-serving-ca": failed to list *v1.ConfigMap: configmaps "etcd-serving-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.734123 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-serving-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"etcd-serving-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723985 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724001 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724016 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724021 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724067 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724067 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724071 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724111 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724116 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724138 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724184 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724195 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724218 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724327 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 01 09:11:31 crc 
kubenswrapper[4792]: I0301 09:11:31.724383 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724414 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724845 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.725001 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.725100 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.725205 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.725347 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.727825 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.728360 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.728462 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.732644 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724221 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.737649 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.738013 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prqqp"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.738227 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4jwnr"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.738457 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.738735 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.738868 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.738983 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.740150 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.740629 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.744832 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-smnq2"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.745313 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q92nw"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.745463 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.745585 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.746049 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.746150 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.746375 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.746598 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.746858 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.747078 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.747111 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.747263 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.747390 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.747558 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.747790 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.748262 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.748436 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.748577 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.748924 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.749221 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 
09:11:31.749461 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.753195 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.753339 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.753509 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.753989 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8qxf\" (UniqueName: \"kubernetes.io/projected/2311d615-fd4d-43c2-9fcb-8858383c2dc9-kube-api-access-v8qxf\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754028 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/499393fc-abcf-4998-9e32-3d43a0b1e488-audit-dir\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754057 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-config\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: 
I0301 09:11:31.754071 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2311d615-fd4d-43c2-9fcb-8858383c2dc9-machine-approver-tls\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754093 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-oauth-config\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754108 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-service-ca\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754129 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754144 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-encryption-config\") pod \"apiserver-76f77b778f-6lk5b\" (UID: 
\"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754159 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-serving-cert\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754174 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-serving-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754188 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2311d615-fd4d-43c2-9fcb-8858383c2dc9-config\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754203 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g22l6\" (UniqueName: \"kubernetes.io/projected/e12bfc30-3142-4073-96c7-a377ff6723f7-kube-api-access-g22l6\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754220 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jjp6q\" (UniqueName: \"kubernetes.io/projected/77e0e285-570c-47bd-854e-538c9367486b-kube-api-access-jjp6q\") pod \"cluster-samples-operator-665b6dd947-kq5kp\" (UID: \"77e0e285-570c-47bd-854e-538c9367486b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754236 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-console-config\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754251 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-trusted-ca-bundle\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754269 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e12bfc30-3142-4073-96c7-a377ff6723f7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754284 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-serving-cert\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754301 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw279\" (UniqueName: \"kubernetes.io/projected/499393fc-abcf-4998-9e32-3d43a0b1e488-kube-api-access-bw279\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754318 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62e02cd9-3008-41de-b7b7-dc1f546c5645-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754335 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/77e0e285-570c-47bd-854e-538c9367486b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kq5kp\" (UID: \"77e0e285-570c-47bd-854e-538c9367486b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754350 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-audit\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754365 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-client\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754379 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-client-ca\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754394 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/499393fc-abcf-4998-9e32-3d43a0b1e488-node-pullsecrets\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754408 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-config\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754423 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2311d615-fd4d-43c2-9fcb-8858383c2dc9-auth-proxy-config\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754437 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnjdm\" (UniqueName: \"kubernetes.io/projected/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-kube-api-access-fnjdm\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754454 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-image-import-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754468 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-serving-cert\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754493 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wdc9\" (UniqueName: \"kubernetes.io/projected/6cc55bdf-6c0f-4d35-879f-c64c2dc4897c-kube-api-access-6wdc9\") pod \"downloads-7954f5f757-wxl8v\" (UID: \"6cc55bdf-6c0f-4d35-879f-c64c2dc4897c\") " pod="openshift-console/downloads-7954f5f757-wxl8v" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754510 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn8s2\" (UniqueName: 
\"kubernetes.io/projected/62e02cd9-3008-41de-b7b7-dc1f546c5645-kube-api-access-hn8s2\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754527 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e12bfc30-3142-4073-96c7-a377ff6723f7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754542 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-oauth-serving-cert\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754558 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qmx8\" (UniqueName: \"kubernetes.io/projected/86788093-42e5-4fa0-9595-97a910e6557e-kube-api-access-6qmx8\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754575 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e02cd9-3008-41de-b7b7-dc1f546c5645-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754661 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.755079 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.755448 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.755517 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.755759 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.755883 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.756124 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.756679 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.756809 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.756840 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 01 09:11:31 crc 
kubenswrapper[4792]: I0301 09:11:31.756950 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.756994 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.757007 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.758242 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-qtg4x"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.758855 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.759363 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.759890 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.762446 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.762555 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.763492 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.763548 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.763677 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.763958 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.763969 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.764181 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.764344 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.764613 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.764828 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.764991 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.765115 
4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.765236 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.765257 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.765661 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-877gr"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.765698 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.773421 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.776214 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.776698 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.780329 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.784566 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.786133 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.807405 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.810191 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.810245 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.810740 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.811742 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.812785 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.812833 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.819879 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.836596 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.837017 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.837322 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.839431 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.841059 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.841556 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.842551 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.843045 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.845135 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.847971 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.849126 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.850337 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.856641 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857161 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51683a24-edad-4808-b2ec-6a628bfdd937-config\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857205 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857225 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeacfd31-08e1-49e6-afda-95efa2d815d2-config\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:31 crc 
kubenswrapper[4792]: I0301 09:11:31.857240 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qx6z\" (UniqueName: \"kubernetes.io/projected/eeacfd31-08e1-49e6-afda-95efa2d815d2-kube-api-access-8qx6z\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857267 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-config\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857287 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2311d615-fd4d-43c2-9fcb-8858383c2dc9-machine-approver-tls\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857302 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn288\" (UniqueName: \"kubernetes.io/projected/51683a24-edad-4808-b2ec-6a628bfdd937-kube-api-access-bn288\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857319 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-audit-policies\") pod 
\"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857338 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857354 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857371 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-config\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857395 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-oauth-config\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857411 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-service-ca\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857427 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-dir\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857450 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857465 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-encryption-config\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857483 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc 
kubenswrapper[4792]: I0301 09:11:31.857500 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47d3ee47-7c75-4321-8b9c-5e119a92a311-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857517 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-serving-cert\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857532 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-serving-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857551 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2311d615-fd4d-43c2-9fcb-8858383c2dc9-config\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857569 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d0571e3-5089-4157-a36a-25ecfe6a67f2-service-ca-bundle\") pod \"router-default-5444994796-qtg4x\" (UID: 
\"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857588 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkbg8\" (UniqueName: \"kubernetes.io/projected/0a5ad85c-19b5-432d-aa36-d0db74e44744-kube-api-access-kkbg8\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857607 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g22l6\" (UniqueName: \"kubernetes.io/projected/e12bfc30-3142-4073-96c7-a377ff6723f7-kube-api-access-g22l6\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857625 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjp6q\" (UniqueName: \"kubernetes.io/projected/77e0e285-570c-47bd-854e-538c9367486b-kube-api-access-jjp6q\") pod \"cluster-samples-operator-665b6dd947-kq5kp\" (UID: \"77e0e285-570c-47bd-854e-538c9367486b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857644 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/51683a24-edad-4808-b2ec-6a628bfdd937-images\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857662 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f432b26-7417-4b71-a63a-5cb9a142bd43-serving-cert\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857700 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857718 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857736 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-stats-auth\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857756 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6d6h\" (UniqueName: \"kubernetes.io/projected/47d3ee47-7c75-4321-8b9c-5e119a92a311-kube-api-access-f6d6h\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857779 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/51683a24-edad-4808-b2ec-6a628bfdd937-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857803 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee9e9d4-e788-41cb-b601-035551b5338c-serving-cert\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857823 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857840 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-service-ca-bundle\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857854 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-metrics-certs\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857873 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-console-config\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857892 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-trusted-ca-bundle\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857948 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e12bfc30-3142-4073-96c7-a377ff6723f7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857966 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857983 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858000 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-config\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858022 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvxvc\" (UniqueName: \"kubernetes.io/projected/6d0571e3-5089-4157-a36a-25ecfe6a67f2-kube-api-access-bvxvc\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858040 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw279\" (UniqueName: \"kubernetes.io/projected/499393fc-abcf-4998-9e32-3d43a0b1e488-kube-api-access-bw279\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858057 
4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62e02cd9-3008-41de-b7b7-dc1f546c5645-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858073 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-config\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858091 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eeacfd31-08e1-49e6-afda-95efa2d815d2-trusted-ca\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858107 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-etcd-client\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858124 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-serving-cert\") pod \"apiserver-7bbb656c7d-977d4\" (UID: 
\"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858139 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858154 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47d3ee47-7c75-4321-8b9c-5e119a92a311-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858172 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/77e0e285-570c-47bd-854e-538c9367486b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kq5kp\" (UID: \"77e0e285-570c-47bd-854e-538c9367486b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858187 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-audit\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858203 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-client\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858219 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-client-ca\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858237 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/499393fc-abcf-4998-9e32-3d43a0b1e488-node-pullsecrets\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858252 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-config\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858268 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a5ad85c-19b5-432d-aa36-d0db74e44744-proxy-tls\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" Mar 01 
09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858284 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e0b63d94-59de-45da-8058-89714bea7a90-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9smfd\" (UID: \"e0b63d94-59de-45da-8058-89714bea7a90\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858301 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b4xc\" (UniqueName: \"kubernetes.io/projected/1f432b26-7417-4b71-a63a-5cb9a142bd43-kube-api-access-7b4xc\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858318 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2311d615-fd4d-43c2-9fcb-8858383c2dc9-auth-proxy-config\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858336 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnjdm\" (UniqueName: \"kubernetes.io/projected/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-kube-api-access-fnjdm\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858354 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858369 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-encryption-config\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858387 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9ee9e9d4-e788-41cb-b601-035551b5338c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858402 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858428 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-image-import-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: 
\"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858443 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-serving-cert\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858460 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858477 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpcg5\" (UniqueName: \"kubernetes.io/projected/9ee9e9d4-e788-41cb-b601-035551b5338c-kube-api-access-lpcg5\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858499 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858514 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858532 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wdc9\" (UniqueName: \"kubernetes.io/projected/6cc55bdf-6c0f-4d35-879f-c64c2dc4897c-kube-api-access-6wdc9\") pod \"downloads-7954f5f757-wxl8v\" (UID: \"6cc55bdf-6c0f-4d35-879f-c64c2dc4897c\") " pod="openshift-console/downloads-7954f5f757-wxl8v" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858551 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn8s2\" (UniqueName: \"kubernetes.io/projected/62e02cd9-3008-41de-b7b7-dc1f546c5645-kube-api-access-hn8s2\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858567 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e12bfc30-3142-4073-96c7-a377ff6723f7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858582 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-oauth-serving-cert\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " 
pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qmx8\" (UniqueName: \"kubernetes.io/projected/86788093-42e5-4fa0-9595-97a910e6557e-kube-api-access-6qmx8\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858614 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e02cd9-3008-41de-b7b7-dc1f546c5645-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858631 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9d95b2fd-64be-4688-a596-c41bb31cb9c4-audit-dir\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858647 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-policies\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858665 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55wpx\" (UniqueName: 
\"kubernetes.io/projected/020a8218-62f4-4abf-a8d2-fed602de5f7f-kube-api-access-55wpx\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858681 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeacfd31-08e1-49e6-afda-95efa2d815d2-serving-cert\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858697 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8qxf\" (UniqueName: \"kubernetes.io/projected/2311d615-fd4d-43c2-9fcb-8858383c2dc9-kube-api-access-v8qxf\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858715 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spbvd\" (UniqueName: \"kubernetes.io/projected/e0b63d94-59de-45da-8058-89714bea7a90-kube-api-access-spbvd\") pod \"control-plane-machine-set-operator-78cbb6b69f-9smfd\" (UID: \"e0b63d94-59de-45da-8058-89714bea7a90\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858731 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858745 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858768 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-client-ca\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858783 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a5ad85c-19b5-432d-aa36-d0db74e44744-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858802 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/499393fc-abcf-4998-9e32-3d43a0b1e488-audit-dir\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858819 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/8578f8dc-143c-423c-b62b-b3190444bafd-serving-cert\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858836 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qq5x\" (UniqueName: \"kubernetes.io/projected/8578f8dc-143c-423c-b62b-b3190444bafd-kube-api-access-6qq5x\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858851 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xslxf\" (UniqueName: \"kubernetes.io/projected/9d95b2fd-64be-4688-a596-c41bb31cb9c4-kube-api-access-xslxf\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858867 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858882 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-default-certificate\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " 
pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.861242 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-service-ca\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.864648 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zrzcg"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.864701 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wxl8v"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.864711 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.869586 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-client-ca\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.869876 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/499393fc-abcf-4998-9e32-3d43a0b1e488-node-pullsecrets\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.871287 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-config\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.872160 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2311d615-fd4d-43c2-9fcb-8858383c2dc9-auth-proxy-config\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.873168 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.875851 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-console-config\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.877223 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2311d615-fd4d-43c2-9fcb-8858383c2dc9-config\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.877350 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/499393fc-abcf-4998-9e32-3d43a0b1e488-audit-dir\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " 
pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.877875 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e12bfc30-3142-4073-96c7-a377ff6723f7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.878356 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-oauth-serving-cert\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.878357 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-encryption-config\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.878781 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e02cd9-3008-41de-b7b7-dc1f546c5645-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.886953 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e12bfc30-3142-4073-96c7-a377ff6723f7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.887483 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2311d615-fd4d-43c2-9fcb-8858383c2dc9-machine-approver-tls\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.888003 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-oauth-config\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.888987 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-serving-cert\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.889063 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj"]
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.889630 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.889875 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539270-q7hck"]
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.890298 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539270-q7hck"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.890755 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-trusted-ca-bundle\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.890876 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-serving-cert\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.893375 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/77e0e285-570c-47bd-854e-538c9367486b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kq5kp\" (UID: \"77e0e285-570c-47bd-854e-538c9367486b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.897314 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg"]
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.897381 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d"]
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.897843 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c"]
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.899666 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.900358 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.900518 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62e02cd9-3008-41de-b7b7-dc1f546c5645-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.921543 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.927406 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.939868 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prqqp"]
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.939949 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wl9zt"]
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.939971 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8"]
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.941066 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.941428 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.948543 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.950672 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"]
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.960873 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvxvc\" (UniqueName: \"kubernetes.io/projected/6d0571e3-5089-4157-a36a-25ecfe6a67f2-kube-api-access-bvxvc\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.960963 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-config\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961018 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-config\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961061 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47d3ee47-7c75-4321-8b9c-5e119a92a311-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961097 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eeacfd31-08e1-49e6-afda-95efa2d815d2-trusted-ca\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961154 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-etcd-client\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961185 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-serving-cert\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961244 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961314 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e0b63d94-59de-45da-8058-89714bea7a90-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9smfd\" (UID: \"e0b63d94-59de-45da-8058-89714bea7a90\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961348 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b4xc\" (UniqueName: \"kubernetes.io/projected/1f432b26-7417-4b71-a63a-5cb9a142bd43-kube-api-access-7b4xc\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961386 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a5ad85c-19b5-432d-aa36-d0db74e44744-proxy-tls\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961418 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961478 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961523 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-encryption-config\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961557 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9ee9e9d4-e788-41cb-b601-035551b5338c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961606 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961648 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpcg5\" (UniqueName: \"kubernetes.io/projected/9ee9e9d4-e788-41cb-b601-035551b5338c-kube-api-access-lpcg5\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961702 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961767 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961850 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeacfd31-08e1-49e6-afda-95efa2d815d2-serving-cert\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961923 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9d95b2fd-64be-4688-a596-c41bb31cb9c4-audit-dir\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961959 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-policies\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961989 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55wpx\" (UniqueName: \"kubernetes.io/projected/020a8218-62f4-4abf-a8d2-fed602de5f7f-kube-api-access-55wpx\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962080 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spbvd\" (UniqueName: \"kubernetes.io/projected/e0b63d94-59de-45da-8058-89714bea7a90-kube-api-access-spbvd\") pod \"control-plane-machine-set-operator-78cbb6b69f-9smfd\" (UID: \"e0b63d94-59de-45da-8058-89714bea7a90\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962127 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962198 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962248 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-client-ca\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962276 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a5ad85c-19b5-432d-aa36-d0db74e44744-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962308 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qq5x\" (UniqueName: \"kubernetes.io/projected/8578f8dc-143c-423c-b62b-b3190444bafd-kube-api-access-6qq5x\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962365 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xslxf\" (UniqueName: \"kubernetes.io/projected/9d95b2fd-64be-4688-a596-c41bb31cb9c4-kube-api-access-xslxf\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962376 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-config\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962400 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8578f8dc-143c-423c-b62b-b3190444bafd-serving-cert\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962427 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962475 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-default-certificate\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962532 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51683a24-edad-4808-b2ec-6a628bfdd937-config\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962602 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962656 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeacfd31-08e1-49e6-afda-95efa2d815d2-config\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962714 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qx6z\" (UniqueName: \"kubernetes.io/projected/eeacfd31-08e1-49e6-afda-95efa2d815d2-kube-api-access-8qx6z\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962747 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-audit-policies\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962814 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962862 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn288\" (UniqueName: \"kubernetes.io/projected/51683a24-edad-4808-b2ec-6a628bfdd937-kube-api-access-bn288\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962889 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962959 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-config\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963020 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-dir\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963074 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963115 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47d3ee47-7c75-4321-8b9c-5e119a92a311-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963159 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d0571e3-5089-4157-a36a-25ecfe6a67f2-service-ca-bundle\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkbg8\" (UniqueName: \"kubernetes.io/projected/0a5ad85c-19b5-432d-aa36-d0db74e44744-kube-api-access-kkbg8\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963261 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/51683a24-edad-4808-b2ec-6a628bfdd937-images\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963292 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f432b26-7417-4b71-a63a-5cb9a142bd43-serving-cert\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963384 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-stats-auth\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963447 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6d6h\" (UniqueName: \"kubernetes.io/projected/47d3ee47-7c75-4321-8b9c-5e119a92a311-kube-api-access-f6d6h\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963475 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-metrics-certs\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963543 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/51683a24-edad-4808-b2ec-6a628bfdd937-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963579 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee9e9d4-e788-41cb-b601-035551b5338c-serving-cert\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963616 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963645 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-service-ca-bundle\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963693 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.964785 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-policies\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.966477 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eeacfd31-08e1-49e6-afda-95efa2d815d2-trusted-ca\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.969071 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.970399 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.971183 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a5ad85c-19b5-432d-aa36-d0db74e44744-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.972484 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-client-ca\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.972499 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9ee9e9d4-e788-41cb-b601-035551b5338c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.976268 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.976358 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9d95b2fd-64be-4688-a596-c41bb31cb9c4-audit-dir\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.987857 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.992861 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-encryption-config\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.993023 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/51683a24-edad-4808-b2ec-6a628bfdd937-images\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301
09:11:31.993715 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.993915 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.994040 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e0b63d94-59de-45da-8058-89714bea7a90-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9smfd\" (UID: \"e0b63d94-59de-45da-8058-89714bea7a90\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.994247 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.994636 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeacfd31-08e1-49e6-afda-95efa2d815d2-serving-cert\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.994815 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-audit-policies\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.995049 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.998721 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.999109 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.999519 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f432b26-7417-4b71-a63a-5cb9a142bd43-serving-cert\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.000585 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.001359 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.002892 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8578f8dc-143c-423c-b62b-b3190444bafd-serving-cert\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.003252 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51683a24-edad-4808-b2ec-6a628bfdd937-config\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.003322 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-service-ca-bundle\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.003385 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-serving-cert\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.003715 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.004161 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-etcd-client\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.004503 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.004765 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeacfd31-08e1-49e6-afda-95efa2d815d2-config\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.004975 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/51683a24-edad-4808-b2ec-6a628bfdd937-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 
09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.005100 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-dir\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.005526 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.005801 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-config\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.006186 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.006456 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee9e9d4-e788-41cb-b601-035551b5338c-serving-cert\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: 
\"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.008549 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gk6c6"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.011545 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.012266 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-r7d4f"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.012677 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rjwhk"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.017832 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.018175 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.019083 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.019248 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.019308 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.019558 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wpgg2"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.020734 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.020758 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tswcj"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.020819 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.020877 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.022448 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.027509 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6lk5b"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.030763 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q92nw"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.032121 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4jwnr"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.038208 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.038715 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.040522 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.041174 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-smnq2"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.042988 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rjwhk"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.043254 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.044542 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539270-q7hck"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.044767 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.046506 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.046533 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.047323 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-64dsw"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.049753 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.051475 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-glj9p"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.052395 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.052949 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.057692 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.058447 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.059840 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.060824 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.061323 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.062574 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-64dsw"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.072420 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-877gr"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.075062 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8qrq4"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.076775 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-r7d4f"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.078620 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.082009 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.082284 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.084470 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.085015 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gk6c6"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.086145 4792 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.087066 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dgh8q"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.088106 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dgh8q" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.088379 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.089614 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dgh8q"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.090961 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.093540 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wpgg2"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.094743 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.103940 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.121797 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.142705 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.182214 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.202252 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.214184 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-stats-auth\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.221833 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.232435 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-metrics-certs\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.242291 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.253164 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d0571e3-5089-4157-a36a-25ecfe6a67f2-service-ca-bundle\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:32 crc 
kubenswrapper[4792]: I0301 09:11:32.263552 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.268749 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-default-certificate\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.281408 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.286663 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.303384 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.326432 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.335180 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-config\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" Mar 01 09:11:32 crc 
kubenswrapper[4792]: I0301 09:11:32.341380 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.362189 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.381249 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.400732 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.421341 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.442618 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.457443 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a5ad85c-19b5-432d-aa36-d0db74e44744-proxy-tls\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.462309 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.481612 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" 
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.502273 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.520856 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.542332 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.561719 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.567993 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47d3ee47-7c75-4321-8b9c-5e119a92a311-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.581219 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.601408 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.622240 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.630347 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47d3ee47-7c75-4321-8b9c-5e119a92a311-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.662666 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.682209 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.701888 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.722879 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.741822 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.761878 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.782566 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.801590 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.822411 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.841894 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.859589 4792 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed to sync configmap cache: timed out waiting for the condition
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.859630 4792 request.go:700] Waited for 1.007117324s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/configmaps?fieldSelector=metadata.name%3Dmachine-config-operator-images&limit=500&resourceVersion=0
Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.859667 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-config podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:33.359646186 +0000 UTC m=+222.601525393 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-config") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync configmap cache: timed out waiting for the condition
Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.861620 4792 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.861670 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-trusted-ca-bundle podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:33.361657125 +0000 UTC m=+222.603536332 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-trusted-ca-bundle") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync configmap cache: timed out waiting for the condition
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.861798 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.868413 4792 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition
Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.868472 4792 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition
Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.868558 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-client podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:33.368504384 +0000 UTC m=+222.610383661 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-client") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync secret cache: timed out waiting for the condition
Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.868598 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-audit podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:33.368580746 +0000 UTC m=+222.610460093 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-audit") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync configmap cache: timed out waiting for the condition
Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.877877 4792 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.878017 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-serving-ca podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:33.377977087 +0000 UTC m=+222.619856324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-serving-ca") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync configmap cache: timed out waiting for the condition
Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.879383 4792 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.879465 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-image-import-ca podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:33.379443063 +0000 UTC m=+222.621322300 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-image-import-ca") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync configmap cache: timed out waiting for the condition
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.882261 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.887431 4792 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.887523 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-serving-cert podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:33.387500621 +0000 UTC m=+222.629379858 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-serving-cert") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync secret cache: timed out waiting for the condition
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.932423 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnjdm\" (UniqueName: \"kubernetes.io/projected/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-kube-api-access-fnjdm\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.949892 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8qxf\" (UniqueName: \"kubernetes.io/projected/2311d615-fd4d-43c2-9fcb-8858383c2dc9-kube-api-access-v8qxf\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.955554 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g22l6\" (UniqueName: \"kubernetes.io/projected/e12bfc30-3142-4073-96c7-a377ff6723f7-kube-api-access-g22l6\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv"
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.980167 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjp6q\" (UniqueName: \"kubernetes.io/projected/77e0e285-570c-47bd-854e-538c9367486b-kube-api-access-jjp6q\") pod \"cluster-samples-operator-665b6dd947-kq5kp\" (UID: \"77e0e285-570c-47bd-854e-538c9367486b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.002363 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wdc9\" (UniqueName: \"kubernetes.io/projected/6cc55bdf-6c0f-4d35-879f-c64c2dc4897c-kube-api-access-6wdc9\") pod \"downloads-7954f5f757-wxl8v\" (UID: \"6cc55bdf-6c0f-4d35-879f-c64c2dc4897c\") " pod="openshift-console/downloads-7954f5f757-wxl8v"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.019317 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn8s2\" (UniqueName: \"kubernetes.io/projected/62e02cd9-3008-41de-b7b7-dc1f546c5645-kube-api-access-hn8s2\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.029292 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.036296 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qmx8\" (UniqueName: \"kubernetes.io/projected/86788093-42e5-4fa0-9595-97a910e6557e-kube-api-access-6qmx8\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg"
Mar 01 09:11:33 crc kubenswrapper[4792]: W0301 09:11:33.044338 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2311d615_fd4d_43c2_9fcb_8858383c2dc9.slice/crio-65797377256be8bc6bc0a1ee46963b80d3e126171b7e9e542f02e3a14e38f21f WatchSource:0}: Error finding container 65797377256be8bc6bc0a1ee46963b80d3e126171b7e9e542f02e3a14e38f21f: Status 404 returned error can't find the container with id 65797377256be8bc6bc0a1ee46963b80d3e126171b7e9e542f02e3a14e38f21f
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.053482 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.061978 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.072737 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.083389 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.102181 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.122991 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.142211 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.162379 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.172650 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.185872 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-wxl8v"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.201479 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.205266 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zrzcg"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.212540 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.222842 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.233241 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.250263 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.266191 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.307405 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvxvc\" (UniqueName: \"kubernetes.io/projected/6d0571e3-5089-4157-a36a-25ecfe6a67f2-kube-api-access-bvxvc\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.318616 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55wpx\" (UniqueName: \"kubernetes.io/projected/020a8218-62f4-4abf-a8d2-fed602de5f7f-kube-api-access-55wpx\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.327640 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg"]
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.340127 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spbvd\" (UniqueName: \"kubernetes.io/projected/e0b63d94-59de-45da-8058-89714bea7a90-kube-api-access-spbvd\") pod \"control-plane-machine-set-operator-78cbb6b69f-9smfd\" (UID: \"e0b63d94-59de-45da-8058-89714bea7a90\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.346253 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.365354 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qq5x\" (UniqueName: \"kubernetes.io/projected/8578f8dc-143c-423c-b62b-b3190444bafd-kube-api-access-6qq5x\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.377433 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xslxf\" (UniqueName: \"kubernetes.io/projected/9d95b2fd-64be-4688-a596-c41bb31cb9c4-kube-api-access-xslxf\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.396576 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-config\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.396651 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.396676 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-serving-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.396750 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-audit\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.396772 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-client\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.396806 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-serving-cert\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.396830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-image-import-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.412021 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpcg5\" (UniqueName: \"kubernetes.io/projected/9ee9e9d4-e788-41cb-b601-035551b5338c-kube-api-access-lpcg5\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.419736 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b4xc\" (UniqueName: \"kubernetes.io/projected/1f432b26-7417-4b71-a63a-5cb9a142bd43-kube-api-access-7b4xc\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.422317 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" event={"ID":"62e02cd9-3008-41de-b7b7-dc1f546c5645","Type":"ContainerStarted","Data":"266c36198a1abe62955aa041f65c04fc576809261770109d31b0041307a83502"}
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.422349 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv"]
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.437500 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.441068 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" event={"ID":"2311d615-fd4d-43c2-9fcb-8858383c2dc9","Type":"ContainerStarted","Data":"8b2df384673cae5bf90ee9062b4cc2140974e37edfa75e008601913dd12b843c"}
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.441114 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" event={"ID":"2311d615-fd4d-43c2-9fcb-8858383c2dc9","Type":"ContainerStarted","Data":"65797377256be8bc6bc0a1ee46963b80d3e126171b7e9e542f02e3a14e38f21f"}
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.441636 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkbg8\" (UniqueName: \"kubernetes.io/projected/0a5ad85c-19b5-432d-aa36-d0db74e44744-kube-api-access-kkbg8\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.464493 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.479654 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6d6h\" (UniqueName: \"kubernetes.io/projected/47d3ee47-7c75-4321-8b9c-5e119a92a311-kube-api-access-f6d6h\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.489308 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"]
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.498845 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qx6z\" (UniqueName: \"kubernetes.io/projected/eeacfd31-08e1-49e6-afda-95efa2d815d2-kube-api-access-8qx6z\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.506346 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp"]
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.516612 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn288\" (UniqueName: \"kubernetes.io/projected/51683a24-edad-4808-b2ec-6a628bfdd937-kube-api-access-bn288\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"
Mar 01 09:11:33 crc kubenswrapper[4792]: W0301 09:11:33.522871 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44c647cb_a9e2_4e75_abb3_5d3cdbe881a2.slice/crio-c11caa71735c72b6244f64898704ac0350a94ec2fdec7c12cfb52f25b0fabfa9 WatchSource:0}: Error finding container c11caa71735c72b6244f64898704ac0350a94ec2fdec7c12cfb52f25b0fabfa9: Status 404 returned error can't find the container with id c11caa71735c72b6244f64898704ac0350a94ec2fdec7c12cfb52f25b0fabfa9
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.523581 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.539283 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.541413 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.546802 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.553528 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wxl8v"]
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.564268 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 01 09:11:33 crc kubenswrapper[4792]: W0301 09:11:33.574005 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cc55bdf_6c0f_4d35_879f_c64c2dc4897c.slice/crio-46206446d1604ae68341dc037b20eac001ae46b9e483ab82dab28549f8b6feab WatchSource:0}: Error finding container 46206446d1604ae68341dc037b20eac001ae46b9e483ab82dab28549f8b6feab: Status 404 returned error can't find the container with id 46206446d1604ae68341dc037b20eac001ae46b9e483ab82dab28549f8b6feab
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.581639 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.582566 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.593241 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.602418 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.602565 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-qtg4x"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.610297 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.622238 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.624528 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.646730 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.651032 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.656485 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prqqp"]
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.659782 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.661822 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.681984 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.706714 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.721846 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.744348 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.758177 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8qrq4"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.767057 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.768685 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"]
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.782938 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 01 09:11:33 crc kubenswrapper[4792]: W0301 09:11:33.783845 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ee9e9d4_e788_41cb_b601_035551b5338c.slice/crio-b3f78f535e977cde4c2e5ec2a45cfe5751a4144dd17031be1b1eeada78b1af76 WatchSource:0}: Error finding container b3f78f535e977cde4c2e5ec2a45cfe5751a4144dd17031be1b1eeada78b1af76: Status 404 returned error can't find the container with id b3f78f535e977cde4c2e5ec2a45cfe5751a4144dd17031be1b1eeada78b1af76
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.801596 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zrzcg"]
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.804640 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.835097 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.848021 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-877gr"]
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.852277 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.861749 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.880179 4792 request.go:700] Waited for 1.848095397s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dcollect-profiles-config&limit=500&resourceVersion=0
Mar 01 09:11:33 crc kubenswrapper[4792]: W0301 09:11:33.883708 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a5ad85c_19b5_432d_aa36_d0db74e44744.slice/crio-6737d4e8211f8c4d682ae34ee5d82c8354f8c966d735715288edc667235636e3 WatchSource:0}: Error finding container 6737d4e8211f8c4d682ae34ee5d82c8354f8c966d735715288edc667235636e3: Status 404 returned error can't find the container with id 6737d4e8211f8c4d682ae34ee5d82c8354f8c966d735715288edc667235636e3
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.883880 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 01 09:11:33 crc kubenswrapper[4792]: W0301 09:11:33.883982 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86788093_42e5_4fa0_9595_97a910e6557e.slice/crio-d8eca6eb41f8b18cd9f4704945f3f7da8382b69f90c64f182bda36eef645951e WatchSource:0}: Error finding container d8eca6eb41f8b18cd9f4704945f3f7da8382b69f90c64f182bda36eef645951e: Status 404 returned error can't find the container with id d8eca6eb41f8b18cd9f4704945f3f7da8382b69f90c64f182bda36eef645951e
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.901441 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.902062 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wl9zt"]
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.947430 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.955061 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.964992 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.988581 4792 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.002777 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.023810 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.044464 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.062335 4792 projected.go:288] Couldn't get configMap openshift-apiserver/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.062371 4792 projected.go:194] Error preparing data for projected volume
kube-api-access-bw279 for pod openshift-apiserver/apiserver-76f77b778f-6lk5b: failed to sync configmap cache: timed out waiting for the condition Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.062440 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/499393fc-abcf-4998-9e32-3d43a0b1e488-kube-api-access-bw279 podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:34.562419236 +0000 UTC m=+223.804298433 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-bw279" (UniqueName: "kubernetes.io/projected/499393fc-abcf-4998-9e32-3d43a0b1e488-kube-api-access-bw279") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync configmap cache: timed out waiting for the condition Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.065072 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.081758 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.103496 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.122827 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.129140 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tswcj"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.131032 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"] Mar 01 09:11:34 crc 
kubenswrapper[4792]: I0301 09:11:34.142200 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.219597 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.221470 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.221768 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.225523 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.228128 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.233524 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-serving-cert\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.241725 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.243871 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.250667 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-serving-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258167 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-client\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258227 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b1f7190f-8547-4938-8023-708e4891409d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258276 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-service-ca\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258312 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbv6x\" (UniqueName: 
\"kubernetes.io/projected/6f49f99d-4119-400a-88d5-6fdf48da4d64-kube-api-access-gbv6x\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258335 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-trusted-ca\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258354 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k789z\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-kube-api-access-k789z\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258380 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258404 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-config\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc 
kubenswrapper[4792]: I0301 09:11:34.258426 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a29641af-98a4-47ca-baca-7e933d7a00d5-config\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258450 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1f7190f-8547-4938-8023-708e4891409d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258491 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx4p9\" (UniqueName: \"kubernetes.io/projected/b1f7190f-8547-4938-8023-708e4891409d-kube-api-access-kx4p9\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258568 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258599 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-tls\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258672 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-certificates\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.259130 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:34.759116848 +0000 UTC m=+224.000996045 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: W0301 09:11:34.259380 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d95b2fd_64be_4688_a596_c41bb31cb9c4.slice/crio-fcc34692643388dd6dce5229f4741d15ab4fda6bca936d66d57b07de450c074f WatchSource:0}: Error finding container fcc34692643388dd6dce5229f4741d15ab4fda6bca936d66d57b07de450c074f: Status 404 returned error can't find the container with id fcc34692643388dd6dce5229f4741d15ab4fda6bca936d66d57b07de450c074f Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259432 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a29641af-98a4-47ca-baca-7e933d7a00d5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259551 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d566570d-4f58-487b-b824-839792e88650-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-smnq2\" (UID: \"d566570d-4f58-487b-b824-839792e88650\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259571 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259593 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259616 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259626 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259643 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc 
kubenswrapper[4792]: I0301 09:11:34.259657 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-bound-sa-token\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259891 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1f7190f-8547-4938-8023-708e4891409d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.260125 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-ca\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.260163 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f49f99d-4119-400a-88d5-6fdf48da4d64-serving-cert\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.260182 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a29641af-98a4-47ca-baca-7e933d7a00d5-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.260214 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmb5d\" (UniqueName: \"kubernetes.io/projected/d566570d-4f58-487b-b824-839792e88650-kube-api-access-jmb5d\") pod \"multus-admission-controller-857f4d67dd-smnq2\" (UID: \"d566570d-4f58-487b-b824-839792e88650\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.262620 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.270588 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-config\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.281474 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.282671 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8qrq4"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.290791 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-client\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.301277 4792 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.320816 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.328480 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-image-import-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.342080 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.347419 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-audit\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360638 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360735 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-socket-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " 
pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.360786 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:34.86074861 +0000 UTC m=+224.102627807 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360821 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03389f1b-2d84-4b8b-879d-545498a154cc-config\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360851 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd837cd0-c714-48e1-8771-cc6c419f7639-cert\") pod \"ingress-canary-dgh8q\" (UID: \"bd837cd0-c714-48e1-8771-cc6c419f7639\") " pod="openshift-ingress-canary/ingress-canary-dgh8q" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360854 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360918 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d566570d-4f58-487b-b824-839792e88650-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-smnq2\" (UID: \"d566570d-4f58-487b-b824-839792e88650\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360946 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360972 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b72d8baa-f3f8-4263-a36d-9741ad4243d5-tmpfs\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361004 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh4rn\" (UniqueName: \"kubernetes.io/projected/d25464be-fe72-4409-a934-9e8c70542ed6-kube-api-access-rh4rn\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361062 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361084 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361105 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2z9s\" (UniqueName: \"kubernetes.io/projected/03389f1b-2d84-4b8b-879d-545498a154cc-kube-api-access-f2z9s\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361151 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cc60e4e1-1b94-4913-879c-fbd25ff314b9-srv-cert\") pod \"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361169 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppw5j\" (UniqueName: \"kubernetes.io/projected/bd837cd0-c714-48e1-8771-cc6c419f7639-kube-api-access-ppw5j\") pod \"ingress-canary-dgh8q\" (UID: \"bd837cd0-c714-48e1-8771-cc6c419f7639\") " pod="openshift-ingress-canary/ingress-canary-dgh8q"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361231 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361268 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-bound-sa-token\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361291 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cc60e4e1-1b94-4913-879c-fbd25ff314b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361336 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1f7190f-8547-4938-8023-708e4891409d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361356 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/04b15432-c193-4b0c-b527-df9a9b37c886-images\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361393 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b72d8baa-f3f8-4263-a36d-9741ad4243d5-apiservice-cert\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361415 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-node-bootstrap-token\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361470 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-ca\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361508 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/011a4c5f-1a18-4f0d-884f-43bb6477efb6-trusted-ca\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361527 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011a4c5f-1a18-4f0d-884f-43bb6477efb6-metrics-tls\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361544 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/644b4b74-7ce9-4d36-8938-58a1e2b2b49f-metrics-tls\") pod \"dns-operator-744455d44c-wpgg2\" (UID: \"644b4b74-7ce9-4d36-8938-58a1e2b2b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361570 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f49f99d-4119-400a-88d5-6fdf48da4d64-serving-cert\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361593 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a29641af-98a4-47ca-baca-7e933d7a00d5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361660 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361714 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmb5d\" (UniqueName: \"kubernetes.io/projected/d566570d-4f58-487b-b824-839792e88650-kube-api-access-jmb5d\") pod \"multus-admission-controller-857f4d67dd-smnq2\" (UID: \"d566570d-4f58-487b-b824-839792e88650\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361750 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bntn7\" (UniqueName: \"kubernetes.io/projected/011a4c5f-1a18-4f0d-884f-43bb6477efb6-kube-api-access-bntn7\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdrrj\" (UniqueName: \"kubernetes.io/projected/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-kube-api-access-rdrrj\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361802 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c283b49-5e58-4c99-97c2-d53ab428265f-config-volume\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361838 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvmn4\" (UniqueName: \"kubernetes.io/projected/cc60e4e1-1b94-4913-879c-fbd25ff314b9-kube-api-access-pvmn4\") pod \"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361863 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-secret-volume\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361960 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9c283b49-5e58-4c99-97c2-d53ab428265f-metrics-tls\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362022 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04b15432-c193-4b0c-b527-df9a9b37c886-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362077 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7pkt\" (UniqueName: \"kubernetes.io/projected/e37e6dcb-be13-4787-8555-3ba1050f7b77-kube-api-access-r7pkt\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362121 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-client\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362162 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5c679376-bb09-4944-b4ee-3710661612b5-srv-cert\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362186 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-registration-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362227 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03389f1b-2d84-4b8b-879d-545498a154cc-serving-cert\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362252 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdlfc\" (UniqueName: \"kubernetes.io/projected/644b4b74-7ce9-4d36-8938-58a1e2b2b49f-kube-api-access-jdlfc\") pod \"dns-operator-744455d44c-wpgg2\" (UID: \"644b4b74-7ce9-4d36-8938-58a1e2b2b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362280 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362291 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b6np\" (UniqueName: \"kubernetes.io/projected/b72d8baa-f3f8-4263-a36d-9741ad4243d5-kube-api-access-6b6np\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362366 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-config-volume\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362395 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-mountpoint-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362432 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b72d8baa-f3f8-4263-a36d-9741ad4243d5-webhook-cert\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362481 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/011a4c5f-1a18-4f0d-884f-43bb6477efb6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362505 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-certs\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362546 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b1f7190f-8547-4938-8023-708e4891409d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362630 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d25464be-fe72-4409-a934-9e8c70542ed6-signing-key\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362669 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-service-ca\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362693 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ab8f156-05d7-47d8-b849-a49f1c5cf03b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nc4gs\" (UID: \"9ab8f156-05d7-47d8-b849-a49f1c5cf03b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362717 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdjxb\" (UniqueName: \"kubernetes.io/projected/9c283b49-5e58-4c99-97c2-d53ab428265f-kube-api-access-pdjxb\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362744 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbv6x\" (UniqueName: \"kubernetes.io/projected/6f49f99d-4119-400a-88d5-6fdf48da4d64-kube-api-access-gbv6x\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362768 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzw4q\" (UniqueName: \"kubernetes.io/projected/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-kube-api-access-lzw4q\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362818 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-trusted-ca\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362862 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k789z\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-kube-api-access-k789z\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362919 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362951 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-config\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363002 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-csi-data-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363054 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a29641af-98a4-47ca-baca-7e933d7a00d5-config\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363080 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1f7190f-8547-4938-8023-708e4891409d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363118 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx4p9\" (UniqueName: \"kubernetes.io/projected/b1f7190f-8547-4938-8023-708e4891409d-kube-api-access-kx4p9\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363141 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkjx7\" (UniqueName: \"kubernetes.io/projected/9ab8f156-05d7-47d8-b849-a49f1c5cf03b-kube-api-access-bkjx7\") pod \"package-server-manager-789f6589d5-nc4gs\" (UID: \"9ab8f156-05d7-47d8-b849-a49f1c5cf03b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363195 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363279 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-tls\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363311 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql5bq\" (UniqueName: \"kubernetes.io/projected/b4130507-2de2-48c2-9c3f-e9474aeca556-kube-api-access-ql5bq\") pod \"auto-csr-approver-29539270-q7hck\" (UID: \"b4130507-2de2-48c2-9c3f-e9474aeca556\") " pod="openshift-infra/auto-csr-approver-29539270-q7hck"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363345 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363612 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d25464be-fe72-4409-a934-9e8c70542ed6-signing-cabundle\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363650 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-certificates\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363693 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwl7l\" (UniqueName: \"kubernetes.io/projected/04b15432-c193-4b0c-b527-df9a9b37c886-kube-api-access-xwl7l\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363719 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5c679376-bb09-4944-b4ee-3710661612b5-profile-collector-cert\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363745 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52bbw\" (UniqueName: \"kubernetes.io/projected/5c679376-bb09-4944-b4ee-3710661612b5-kube-api-access-52bbw\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363765 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d566570d-4f58-487b-b824-839792e88650-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-smnq2\" (UID: \"d566570d-4f58-487b-b824-839792e88650\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363818 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a29641af-98a4-47ca-baca-7e933d7a00d5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363843 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04b15432-c193-4b0c-b527-df9a9b37c886-proxy-tls\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363867 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xv7w\" (UniqueName: \"kubernetes.io/projected/9b2af767-57b1-4774-9668-6610e9ac1bb9-kube-api-access-6xv7w\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363937 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-plugins-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363966 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs8s7\" (UniqueName: \"kubernetes.io/projected/f3480f1b-eedb-4bc8-b40f-5c527869096a-kube-api-access-hs8s7\") pod \"migrator-59844c95c7-kxx8s\" (UID: \"f3480f1b-eedb-4bc8-b40f-5c527869096a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s"
Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.365318 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:34.865305492 +0000 UTC m=+224.107184689 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.365518 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.366056 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-config\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.366796 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-ca\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.367352 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-certificates\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.367522 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a29641af-98a4-47ca-baca-7e933d7a00d5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.368282 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-service-ca\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.368474 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a29641af-98a4-47ca-baca-7e933d7a00d5-config\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.369168 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b1f7190f-8547-4938-8023-708e4891409d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.369615 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1f7190f-8547-4938-8023-708e4891409d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.370369 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-client\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.370698 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f49f99d-4119-400a-88d5-6fdf48da4d64-serving-cert\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.413075 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-trusted-ca\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.413219 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.413586 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-tls\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:34 crc kubenswrapper[4792]: W0301 09:11:34.418055 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51683a24_edad_4808_b2ec_6a628bfdd937.slice/crio-9bdf109597e2093483ee18245241f28df0a563ea6bae665bae2821dc90890d2e WatchSource:0}: Error finding container 9bdf109597e2093483ee18245241f28df0a563ea6bae665bae2821dc90890d2e: Status 404 returned error can't find the container with id 9bdf109597e2093483ee18245241f28df0a563ea6bae665bae2821dc90890d2e
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.418448 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1f7190f-8547-4938-8023-708e4891409d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.435766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmb5d\" (UniqueName: \"kubernetes.io/projected/d566570d-4f58-487b-b824-839792e88650-kube-api-access-jmb5d\") pod \"multus-admission-controller-857f4d67dd-smnq2\" (UID: \"d566570d-4f58-487b-b824-839792e88650\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2"
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.448503 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" event={"ID":"8578f8dc-143c-423c-b62b-b3190444bafd","Type":"ContainerStarted","Data":"452f21dc7923df996fc4ebcc58043ac03b69b3315c7778ffe5676b68b45c4e4f"}
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.449808 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" event={"ID":"e12bfc30-3142-4073-96c7-a377ff6723f7","Type":"ContainerStarted","Data":"0e23e8c1bc8596bfef55378e85c51963d81b7cd8ed6c35c790cb0e766fb2db0e"}
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.449835 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" event={"ID":"e12bfc30-3142-4073-96c7-a377ff6723f7","Type":"ContainerStarted","Data":"9a36e7fd9cd6114fc82268c71c363e9c165cf82457d0d5553c7a776d50b2b6e4"}
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.451172 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" event={"ID":"e0b63d94-59de-45da-8058-89714bea7a90","Type":"ContainerStarted","Data":"bd38c09d69ba47ec9ef0c003938aaf7f3ee7204fc6c25ebfda3154487ebc28ca"}
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.452245 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" event={"ID":"0a5ad85c-19b5-432d-aa36-d0db74e44744","Type":"ContainerStarted","Data":"6737d4e8211f8c4d682ae34ee5d82c8354f8c966d735715288edc667235636e3"}
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.454883 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" event={"ID":"2311d615-fd4d-43c2-9fcb-8858383c2dc9","Type":"ContainerStarted","Data":"502c112c1bd8a34f51ea3e1ea353eca15fa9a9b3aa6b54edd787bb3e1c2118d4"}
Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.455839 4792 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-bound-sa-token\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.456706 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wxl8v" event={"ID":"6cc55bdf-6c0f-4d35-879f-c64c2dc4897c","Type":"ContainerStarted","Data":"7b9215af955101b19487a965042f4a5f46cb6e7c587c700e7e568a19ff7442d6"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.456736 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wxl8v" event={"ID":"6cc55bdf-6c0f-4d35-879f-c64c2dc4897c","Type":"ContainerStarted","Data":"46206446d1604ae68341dc037b20eac001ae46b9e483ab82dab28549f8b6feab"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.458049 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zrzcg" event={"ID":"86788093-42e5-4fa0-9595-97a910e6557e","Type":"ContainerStarted","Data":"d8eca6eb41f8b18cd9f4704945f3f7da8382b69f90c64f182bda36eef645951e"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.458873 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" event={"ID":"9d95b2fd-64be-4688-a596-c41bb31cb9c4","Type":"ContainerStarted","Data":"fcc34692643388dd6dce5229f4741d15ab4fda6bca936d66d57b07de450c074f"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.459630 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" event={"ID":"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842","Type":"ContainerStarted","Data":"fa97e4696280686943f690b7aa07cba997686cb8ae8cbc98143a0bc6a19ba3f1"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 
09:11:34.461502 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" event={"ID":"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2","Type":"ContainerStarted","Data":"ee95da5910e6e2c13434125c99652d9a77af8d0b5457ae7a7ce20a9282ee3b00"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.461552 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" event={"ID":"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2","Type":"ContainerStarted","Data":"c11caa71735c72b6244f64898704ac0350a94ec2fdec7c12cfb52f25b0fabfa9"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.461683 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.462721 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" event={"ID":"62e02cd9-3008-41de-b7b7-dc1f546c5645","Type":"ContainerStarted","Data":"7fc2d832ee54019ed03f6c7bbda197a07d9d4784bbe5f2d4f6eb053289222e48"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.463710 4792 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-knb62 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.463743 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" podUID="44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 
10.217.0.5:8443: connect: connection refused" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.463890 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" event={"ID":"1f432b26-7417-4b71-a63a-5cb9a142bd43","Type":"ContainerStarted","Data":"05403abb1063104f17831c789351343da38d5a852d04a7591b0f89dc5c068a3f"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464339 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464489 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d25464be-fe72-4409-a934-9e8c70542ed6-signing-cabundle\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464516 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52bbw\" (UniqueName: \"kubernetes.io/projected/5c679376-bb09-4944-b4ee-3710661612b5-kube-api-access-52bbw\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464547 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwl7l\" (UniqueName: \"kubernetes.io/projected/04b15432-c193-4b0c-b527-df9a9b37c886-kube-api-access-xwl7l\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: 
\"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464570 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5c679376-bb09-4944-b4ee-3710661612b5-profile-collector-cert\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464592 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04b15432-c193-4b0c-b527-df9a9b37c886-proxy-tls\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464615 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xv7w\" (UniqueName: \"kubernetes.io/projected/9b2af767-57b1-4774-9668-6610e9ac1bb9-kube-api-access-6xv7w\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464654 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-plugins-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464675 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs8s7\" (UniqueName: 
\"kubernetes.io/projected/f3480f1b-eedb-4bc8-b40f-5c527869096a-kube-api-access-hs8s7\") pod \"migrator-59844c95c7-kxx8s\" (UID: \"f3480f1b-eedb-4bc8-b40f-5c527869096a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464717 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd837cd0-c714-48e1-8771-cc6c419f7639-cert\") pod \"ingress-canary-dgh8q\" (UID: \"bd837cd0-c714-48e1-8771-cc6c419f7639\") " pod="openshift-ingress-canary/ingress-canary-dgh8q" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464739 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-socket-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464759 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03389f1b-2d84-4b8b-879d-545498a154cc-config\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464796 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b72d8baa-f3f8-4263-a36d-9741ad4243d5-tmpfs\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464816 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh4rn\" (UniqueName: 
\"kubernetes.io/projected/d25464be-fe72-4409-a934-9e8c70542ed6-kube-api-access-rh4rn\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464856 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2z9s\" (UniqueName: \"kubernetes.io/projected/03389f1b-2d84-4b8b-879d-545498a154cc-kube-api-access-f2z9s\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464880 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cc60e4e1-1b94-4913-879c-fbd25ff314b9-srv-cert\") pod \"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464927 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppw5j\" (UniqueName: \"kubernetes.io/projected/bd837cd0-c714-48e1-8771-cc6c419f7639-kube-api-access-ppw5j\") pod \"ingress-canary-dgh8q\" (UID: \"bd837cd0-c714-48e1-8771-cc6c419f7639\") " pod="openshift-ingress-canary/ingress-canary-dgh8q" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464970 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cc60e4e1-1b94-4913-879c-fbd25ff314b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464994 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/04b15432-c193-4b0c-b527-df9a9b37c886-images\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465018 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b72d8baa-f3f8-4263-a36d-9741ad4243d5-apiservice-cert\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465039 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-node-bootstrap-token\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465075 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/011a4c5f-1a18-4f0d-884f-43bb6477efb6-trusted-ca\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465099 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011a4c5f-1a18-4f0d-884f-43bb6477efb6-metrics-tls\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc 
kubenswrapper[4792]: I0301 09:11:34.465120 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/644b4b74-7ce9-4d36-8938-58a1e2b2b49f-metrics-tls\") pod \"dns-operator-744455d44c-wpgg2\" (UID: \"644b4b74-7ce9-4d36-8938-58a1e2b2b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465157 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465181 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bntn7\" (UniqueName: \"kubernetes.io/projected/011a4c5f-1a18-4f0d-884f-43bb6477efb6-kube-api-access-bntn7\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465203 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdrrj\" (UniqueName: \"kubernetes.io/projected/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-kube-api-access-rdrrj\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c283b49-5e58-4c99-97c2-d53ab428265f-config-volume\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " 
pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465245 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvmn4\" (UniqueName: \"kubernetes.io/projected/cc60e4e1-1b94-4913-879c-fbd25ff314b9-kube-api-access-pvmn4\") pod \"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465275 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-secret-volume\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465296 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9c283b49-5e58-4c99-97c2-d53ab428265f-metrics-tls\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465328 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04b15432-c193-4b0c-b527-df9a9b37c886-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7pkt\" (UniqueName: \"kubernetes.io/projected/e37e6dcb-be13-4787-8555-3ba1050f7b77-kube-api-access-r7pkt\") pod 
\"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465384 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5c679376-bb09-4944-b4ee-3710661612b5-srv-cert\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465405 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-registration-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465427 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03389f1b-2d84-4b8b-879d-545498a154cc-serving-cert\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465449 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdlfc\" (UniqueName: \"kubernetes.io/projected/644b4b74-7ce9-4d36-8938-58a1e2b2b49f-kube-api-access-jdlfc\") pod \"dns-operator-744455d44c-wpgg2\" (UID: \"644b4b74-7ce9-4d36-8938-58a1e2b2b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465470 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b6np\" (UniqueName: 
\"kubernetes.io/projected/b72d8baa-f3f8-4263-a36d-9741ad4243d5-kube-api-access-6b6np\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465493 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-config-volume\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465514 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-mountpoint-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465532 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-certs\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465553 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b72d8baa-f3f8-4263-a36d-9741ad4243d5-webhook-cert\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465587 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/011a4c5f-1a18-4f0d-884f-43bb6477efb6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465612 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465636 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d25464be-fe72-4409-a934-9e8c70542ed6-signing-key\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465660 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdjxb\" (UniqueName: \"kubernetes.io/projected/9c283b49-5e58-4c99-97c2-d53ab428265f-kube-api-access-pdjxb\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465682 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ab8f156-05d7-47d8-b849-a49f1c5cf03b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nc4gs\" (UID: \"9ab8f156-05d7-47d8-b849-a49f1c5cf03b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:34 crc 
kubenswrapper[4792]: I0301 09:11:34.465710 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzw4q\" (UniqueName: \"kubernetes.io/projected/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-kube-api-access-lzw4q\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465761 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-csi-data-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465810 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkjx7\" (UniqueName: \"kubernetes.io/projected/9ab8f156-05d7-47d8-b849-a49f1c5cf03b-kube-api-access-bkjx7\") pod \"package-server-manager-789f6589d5-nc4gs\" (UID: \"9ab8f156-05d7-47d8-b849-a49f1c5cf03b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465840 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql5bq\" (UniqueName: \"kubernetes.io/projected/b4130507-2de2-48c2-9c3f-e9474aeca556-kube-api-access-ql5bq\") pod \"auto-csr-approver-29539270-q7hck\" (UID: \"b4130507-2de2-48c2-9c3f-e9474aeca556\") " pod="openshift-infra/auto-csr-approver-29539270-q7hck" Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.466069 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-01 09:11:34.966052233 +0000 UTC m=+224.207931430 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.466860 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d25464be-fe72-4409-a934-9e8c70542ed6-signing-cabundle\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.467487 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c283b49-5e58-4c99-97c2-d53ab428265f-config-volume\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.467980 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-mountpoint-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.470308 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5c679376-bb09-4944-b4ee-3710661612b5-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.470331 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-secret-volume\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.470438 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-csi-data-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.472228 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9c283b49-5e58-4c99-97c2-d53ab428265f-metrics-tls\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.472407 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5c679376-bb09-4944-b4ee-3710661612b5-srv-cert\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.472429 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-registration-dir\") pod \"csi-hostpathplugin-64dsw\" 
(UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.472737 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04b15432-c193-4b0c-b527-df9a9b37c886-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.474020 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b72d8baa-f3f8-4263-a36d-9741ad4243d5-webhook-cert\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.474690 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-certs\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.475382 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/04b15432-c193-4b0c-b527-df9a9b37c886-images\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.475471 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-socket-dir\") pod 
\"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.475593 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-plugins-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.476077 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" event={"ID":"51683a24-edad-4808-b2ec-6a628bfdd937","Type":"ContainerStarted","Data":"9bdf109597e2093483ee18245241f28df0a563ea6bae665bae2821dc90890d2e"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.476385 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/011a4c5f-1a18-4f0d-884f-43bb6477efb6-trusted-ca\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.476572 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03389f1b-2d84-4b8b-879d-545498a154cc-config\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.476769 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03389f1b-2d84-4b8b-879d-545498a154cc-serving-cert\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.476990 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b72d8baa-f3f8-4263-a36d-9741ad4243d5-tmpfs\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.479002 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-node-bootstrap-token\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.479486 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04b15432-c193-4b0c-b527-df9a9b37c886-proxy-tls\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.479736 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-config-volume\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.479945 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cc60e4e1-1b94-4913-879c-fbd25ff314b9-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.480535 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/644b4b74-7ce9-4d36-8938-58a1e2b2b49f-metrics-tls\") pod \"dns-operator-744455d44c-wpgg2\" (UID: \"644b4b74-7ce9-4d36-8938-58a1e2b2b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.480527 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ab8f156-05d7-47d8-b849-a49f1c5cf03b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nc4gs\" (UID: \"9ab8f156-05d7-47d8-b849-a49f1c5cf03b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.481674 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" event={"ID":"77e0e285-570c-47bd-854e-538c9367486b","Type":"ContainerStarted","Data":"0a6766d4e40e2675b67316cc7d3c875fd936047845e1c3f01d0d7522bb2a7505"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.489222 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" event={"ID":"47d3ee47-7c75-4321-8b9c-5e119a92a311","Type":"ContainerStarted","Data":"9203c066c00632815d880164f77515e0b915244aec43f169595ab08d34a5df06"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.493746 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" 
event={"ID":"020a8218-62f4-4abf-a8d2-fed602de5f7f","Type":"ContainerStarted","Data":"87d106e344fa51bb8e5e92cf97b7c6070e8daa571dd37784f078b0bfdb5ba165"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.495955 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.503049 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b72d8baa-f3f8-4263-a36d-9741ad4243d5-apiservice-cert\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.503548 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d25464be-fe72-4409-a934-9e8c70542ed6-signing-key\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.504080 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011a4c5f-1a18-4f0d-884f-43bb6477efb6-metrics-tls\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.505962 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" 
event={"ID":"9ee9e9d4-e788-41cb-b601-035551b5338c","Type":"ContainerStarted","Data":"b3f78f535e977cde4c2e5ec2a45cfe5751a4144dd17031be1b1eeada78b1af76"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.508727 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qtg4x" event={"ID":"6d0571e3-5089-4157-a36a-25ecfe6a67f2","Type":"ContainerStarted","Data":"1448bc8f4ddb26ee5a66d7f86003329ab8a218e46fb4e7920989f0830ee402b2"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.509831 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd837cd0-c714-48e1-8771-cc6c419f7639-cert\") pod \"ingress-canary-dgh8q\" (UID: \"bd837cd0-c714-48e1-8771-cc6c419f7639\") " pod="openshift-ingress-canary/ingress-canary-dgh8q" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.511439 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" event={"ID":"eeacfd31-08e1-49e6-afda-95efa2d815d2","Type":"ContainerStarted","Data":"515004a2e31d7d7d7129680a5ec71f040c47d5513ac0abf1bde7b057757f277e"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.517274 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k789z\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-kube-api-access-k789z\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.523779 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.526954 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.541624 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.571070 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.571258 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw279\" (UniqueName: \"kubernetes.io/projected/499393fc-abcf-4998-9e32-3d43a0b1e488-kube-api-access-bw279\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.571352 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.071335225 +0000 UTC m=+224.313214422 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.573933 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw279\" (UniqueName: \"kubernetes.io/projected/499393fc-abcf-4998-9e32-3d43a0b1e488-kube-api-access-bw279\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.594838 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql5bq\" (UniqueName: \"kubernetes.io/projected/b4130507-2de2-48c2-9c3f-e9474aeca556-kube-api-access-ql5bq\") pod \"auto-csr-approver-29539270-q7hck\" (UID: \"b4130507-2de2-48c2-9c3f-e9474aeca556\") " pod="openshift-infra/auto-csr-approver-29539270-q7hck" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.604058 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539270-q7hck" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.615218 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdrrj\" (UniqueName: \"kubernetes.io/projected/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-kube-api-access-rdrrj\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.634882 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52bbw\" (UniqueName: \"kubernetes.io/projected/5c679376-bb09-4944-b4ee-3710661612b5-kube-api-access-52bbw\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.638029 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbv6x\" (UniqueName: \"kubernetes.io/projected/6f49f99d-4119-400a-88d5-6fdf48da4d64-kube-api-access-gbv6x\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.638282 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.638777 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cc60e4e1-1b94-4913-879c-fbd25ff314b9-srv-cert\") 
pod \"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.639024 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx4p9\" (UniqueName: \"kubernetes.io/projected/b1f7190f-8547-4938-8023-708e4891409d-kube-api-access-kx4p9\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.640212 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a29641af-98a4-47ca-baca-7e933d7a00d5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.657947 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwl7l\" (UniqueName: \"kubernetes.io/projected/04b15432-c193-4b0c-b527-df9a9b37c886-kube-api-access-xwl7l\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.672086 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.672261 4792 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.172230368 +0000 UTC m=+224.414109565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.672334 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.672649 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.172639438 +0000 UTC m=+224.414518635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.674456 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvmn4\" (UniqueName: \"kubernetes.io/projected/cc60e4e1-1b94-4913-879c-fbd25ff314b9-kube-api-access-pvmn4\") pod \"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.696846 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdjxb\" (UniqueName: \"kubernetes.io/projected/9c283b49-5e58-4c99-97c2-d53ab428265f-kube-api-access-pdjxb\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.717131 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.719104 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzw4q\" (UniqueName: \"kubernetes.io/projected/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-kube-api-access-lzw4q\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.745058 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkjx7\" (UniqueName: \"kubernetes.io/projected/9ab8f156-05d7-47d8-b849-a49f1c5cf03b-kube-api-access-bkjx7\") pod \"package-server-manager-789f6589d5-nc4gs\" (UID: \"9ab8f156-05d7-47d8-b849-a49f1c5cf03b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.761866 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.763330 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7pkt\" (UniqueName: \"kubernetes.io/projected/e37e6dcb-be13-4787-8555-3ba1050f7b77-kube-api-access-r7pkt\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.771679 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.773610 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.774052 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.274030565 +0000 UTC m=+224.515909762 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.779738 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b6np\" (UniqueName: \"kubernetes.io/projected/b72d8baa-f3f8-4263-a36d-9741ad4243d5-kube-api-access-6b6np\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.814455 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdlfc\" (UniqueName: 
\"kubernetes.io/projected/644b4b74-7ce9-4d36-8938-58a1e2b2b49f-kube-api-access-jdlfc\") pod \"dns-operator-744455d44c-wpgg2\" (UID: \"644b4b74-7ce9-4d36-8938-58a1e2b2b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" Mar 01 09:11:34 crc kubenswrapper[4792]: W0301 09:11:34.816543 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f7932d3_c8c1_4f66_94fb_ea1a45b46889.slice/crio-6d467a8e74ffb6cb59877a3d9e8122a50b4dc3efb898dfefa40d1ca91fd8bdd6 WatchSource:0}: Error finding container 6d467a8e74ffb6cb59877a3d9e8122a50b4dc3efb898dfefa40d1ca91fd8bdd6: Status 404 returned error can't find the container with id 6d467a8e74ffb6cb59877a3d9e8122a50b4dc3efb898dfefa40d1ca91fd8bdd6 Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.819278 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/011a4c5f-1a18-4f0d-884f-43bb6477efb6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.822507 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.831990 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.837366 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xv7w\" (UniqueName: \"kubernetes.io/projected/9b2af767-57b1-4774-9668-6610e9ac1bb9-kube-api-access-6xv7w\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.859156 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs8s7\" (UniqueName: \"kubernetes.io/projected/f3480f1b-eedb-4bc8-b40f-5c527869096a-kube-api-access-hs8s7\") pod \"migrator-59844c95c7-kxx8s\" (UID: \"f3480f1b-eedb-4bc8-b40f-5c527869096a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.874780 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.874947 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bntn7\" (UniqueName: \"kubernetes.io/projected/011a4c5f-1a18-4f0d-884f-43bb6477efb6-kube-api-access-bntn7\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.875150 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.375134854 +0000 UTC m=+224.617014051 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.891251 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.892155 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.892247 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.899311 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.900043 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh4rn\" (UniqueName: \"kubernetes.io/projected/d25464be-fe72-4409-a934-9e8c70542ed6-kube-api-access-rh4rn\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.914006 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.923842 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2z9s\" (UniqueName: \"kubernetes.io/projected/03389f1b-2d84-4b8b-879d-545498a154cc-kube-api-access-f2z9s\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.924101 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.926031 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-smnq2"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.932883 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.935818 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppw5j\" (UniqueName: \"kubernetes.io/projected/bd837cd0-c714-48e1-8771-cc6c419f7639-kube-api-access-ppw5j\") pod \"ingress-canary-dgh8q\" (UID: \"bd837cd0-c714-48e1-8771-cc6c419f7639\") " pod="openshift-ingress-canary/ingress-canary-dgh8q" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.942588 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.950091 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.957779 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539270-q7hck"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.959126 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.967137 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.973200 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.975182 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.975382 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.475364791 +0000 UTC m=+224.717243988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.985685 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.987240 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: W0301 09:11:34.998732 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd566570d_4f58_487b_b824_839792e88650.slice/crio-d313b7dea9bd493c603419846fc91ae18a2bb505f05af1d04f1b3989b7d817a2 WatchSource:0}: Error finding container d313b7dea9bd493c603419846fc91ae18a2bb505f05af1d04f1b3989b7d817a2: Status 404 returned error can't find the container with id d313b7dea9bd493c603419846fc91ae18a2bb505f05af1d04f1b3989b7d817a2 Mar 01 09:11:35 crc kubenswrapper[4792]: W0301 09:11:35.010646 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4130507_2de2_48c2_9c3f_e9474aeca556.slice/crio-ff027c882d14f4f304f09047fac83f66f0a08cd7a1d80eea6711ce8cef3e6448 WatchSource:0}: Error finding container ff027c882d14f4f304f09047fac83f66f0a08cd7a1d80eea6711ce8cef3e6448: Status 404 returned error can't find the container with id ff027c882d14f4f304f09047fac83f66f0a08cd7a1d80eea6711ce8cef3e6448 Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.016495 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.035103 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dgh8q" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.082897 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.083548 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.583532694 +0000 UTC m=+224.825411891 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.172693 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.183499 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.183726 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.68371246 +0000 UTC m=+224.925591657 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.292461 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q92nw"] Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.293351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:35 crc 
kubenswrapper[4792]: E0301 09:11:35.293811 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.79379749 +0000 UTC m=+225.035676687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.371645 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg"] Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.395407 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.395685 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.895670667 +0000 UTC m=+225.137549864 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.475841 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7"] Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.494124 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.502843 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.503370 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.003357408 +0000 UTC m=+225.245236605 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.539259 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" event={"ID":"0a5ad85c-19b5-432d-aa36-d0db74e44744","Type":"ContainerStarted","Data":"3bb35e6ef9f0da7b40811ef29e32e353346635d604b9b39b0dc1a4ec5904ca0a"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.552163 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" event={"ID":"eeacfd31-08e1-49e6-afda-95efa2d815d2","Type":"ContainerStarted","Data":"9eb47924416a8f2a59a0c453bb2ab1e7ec87e083accfd408d3ab0c81005fb1cc"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.552955 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.556653 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-8qrq4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/readyz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.556691 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" podUID="eeacfd31-08e1-49e6-afda-95efa2d815d2" containerName="console-operator" probeResult="failure" output="Get 
\"https://10.217.0.31:8443/readyz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.556989 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539270-q7hck" event={"ID":"b4130507-2de2-48c2-9c3f-e9474aeca556","Type":"ContainerStarted","Data":"ff027c882d14f4f304f09047fac83f66f0a08cd7a1d80eea6711ce8cef3e6448"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.562143 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" event={"ID":"9ee9e9d4-e788-41cb-b601-035551b5338c","Type":"ContainerStarted","Data":"14601b3d86899c0b127c11ea852ecc09b467fb320b1f1a289b6ea1819e6deeed"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.567481 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" event={"ID":"fb3b55fa-972b-4231-8445-bd4cd9a8b88b","Type":"ContainerStarted","Data":"69eaae17ca0c3b10dd436fcaa8b0dceed744cc2c6da2e2950888ab59627b5abd"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.574122 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-glj9p" event={"ID":"3f7932d3-c8c1-4f66-94fb-ea1a45b46889","Type":"ContainerStarted","Data":"6d467a8e74ffb6cb59877a3d9e8122a50b4dc3efb898dfefa40d1ca91fd8bdd6"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.618535 4792 generic.go:334] "Generic (PLEG): container finished" podID="9d95b2fd-64be-4688-a596-c41bb31cb9c4" containerID="828da0440d2ef9c9dc733beff09cbe9cffe6d7dcfa7675ba2e268a387a977a9f" exitCode=0 Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.618622 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" 
event={"ID":"9d95b2fd-64be-4688-a596-c41bb31cb9c4","Type":"ContainerDied","Data":"828da0440d2ef9c9dc733beff09cbe9cffe6d7dcfa7675ba2e268a387a977a9f"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.623691 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.624022 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.124005318 +0000 UTC m=+225.365884515 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.698848 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" event={"ID":"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842","Type":"ContainerStarted","Data":"ce6c0cb2394b93a60649419ffb968cb456bb229f9d79aeb5525f4e38a90d0561"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.724578 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qtg4x" 
event={"ID":"6d0571e3-5089-4157-a36a-25ecfe6a67f2","Type":"ContainerStarted","Data":"107f80305c6b81ce3b32773a2d4c2df641bfa3c110285efe9782296b8c374688"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.726110 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.729499 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.229484535 +0000 UTC m=+225.471363722 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.733305 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" event={"ID":"e0b63d94-59de-45da-8058-89714bea7a90","Type":"ContainerStarted","Data":"435c0c14fbeae7027911e02febcca02089f1b06f392117a8649865a86018cf8c"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.761262 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s"] Mar 01 09:11:35 crc 
kubenswrapper[4792]: I0301 09:11:35.768092 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wpgg2"] Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.783876 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" podStartSLOduration=163.783853854 podStartE2EDuration="2m43.783853854s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:35.775131339 +0000 UTC m=+225.017010536" watchObservedRunningTime="2026-03-01 09:11:35.783853854 +0000 UTC m=+225.025733051" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.801719 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" event={"ID":"8578f8dc-143c-423c-b62b-b3190444bafd","Type":"ContainerStarted","Data":"c13739edba999be4dbdb0675da20f693f5c2f873f236f49fe221573f802f1672"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.802123 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.809806 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" event={"ID":"020a8218-62f4-4abf-a8d2-fed602de5f7f","Type":"ContainerStarted","Data":"63337be8a65e2fc4ccac6c7a4ef78cbcdcc5f66cc06d8b699c08445a9271a940"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.809867 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.811663 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-zrzcg" event={"ID":"86788093-42e5-4fa0-9595-97a910e6557e","Type":"ContainerStarted","Data":"80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.816885 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" podStartSLOduration=163.816872576 podStartE2EDuration="2m43.816872576s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:35.816844196 +0000 UTC m=+225.058723393" watchObservedRunningTime="2026-03-01 09:11:35.816872576 +0000 UTC m=+225.058751773" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.817323 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" event={"ID":"1f432b26-7417-4b71-a63a-5cb9a142bd43","Type":"ContainerStarted","Data":"5d5f80f7330bb6afa57c8bfae40cb90db4a709e518e9787a06868810dc3ed802"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.826974 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.827232 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.327204351 +0000 UTC m=+225.569083548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.835675 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" event={"ID":"51683a24-edad-4808-b2ec-6a628bfdd937","Type":"ContainerStarted","Data":"23990c96fef4c71bf759430f4b2c3b47ab3757999edb6510973ca0d075879c46"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.843013 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" event={"ID":"77e0e285-570c-47bd-854e-538c9367486b","Type":"ContainerStarted","Data":"3960184a2f5a6701136e24eaaa9e26902f8b3063a31ad16fe3f57ca6c4c15e29"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.847917 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" event={"ID":"47d3ee47-7c75-4321-8b9c-5e119a92a311","Type":"ContainerStarted","Data":"7d54ee435dd185de59e9d4fd069cd0c74f0b8dc2e24b0519e8bb2b8344d3f21e"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.851407 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" event={"ID":"d566570d-4f58-487b-b824-839792e88650","Type":"ContainerStarted","Data":"d313b7dea9bd493c603419846fc91ae18a2bb505f05af1d04f1b3989b7d817a2"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.854851 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/downloads-7954f5f757-wxl8v" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.899123 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-wxl8v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.899177 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wxl8v" podUID="6cc55bdf-6c0f-4d35-879f-c64c2dc4897c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.899419 4792 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-wl9zt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.899435 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" podUID="8578f8dc-143c-423c-b62b-b3190444bafd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.899484 4792 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-prqqp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" start-of-body= Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.899496 4792 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" podUID="020a8218-62f4-4abf-a8d2-fed602de5f7f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" Mar 01 09:11:35 crc kubenswrapper[4792]: W0301 09:11:35.918080 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3480f1b_eedb_4bc8_b40f_5c527869096a.slice/crio-4134ab0e66c0f4d4a6c95c020bf21196fa893d296dd143d28e5c16f18fad1e82 WatchSource:0}: Error finding container 4134ab0e66c0f4d4a6c95c020bf21196fa893d296dd143d28e5c16f18fad1e82: Status 404 returned error can't find the container with id 4134ab0e66c0f4d4a6c95c020bf21196fa893d296dd143d28e5c16f18fad1e82 Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.938379 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" podStartSLOduration=164.938354307 podStartE2EDuration="2m44.938354307s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:35.861388282 +0000 UTC m=+225.103267469" watchObservedRunningTime="2026-03-01 09:11:35.938354307 +0000 UTC m=+225.180233514" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.943114 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.949364 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.449348028 +0000 UTC m=+225.691227225 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.044969 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.046109 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.546094209 +0000 UTC m=+225.787973406 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.166111 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.166458 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" podStartSLOduration=164.166442312 podStartE2EDuration="2m44.166442312s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:36.138140575 +0000 UTC m=+225.380019772" watchObservedRunningTime="2026-03-01 09:11:36.166442312 +0000 UTC m=+225.408321509"
Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.178678 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.678637772 +0000 UTC m=+225.920517069 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.245441 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.251575 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" podStartSLOduration=164.251554698 podStartE2EDuration="2m44.251554698s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:36.224125772 +0000 UTC m=+225.466004969" watchObservedRunningTime="2026-03-01 09:11:36.251554698 +0000 UTC m=+225.493433895"
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.271416 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.271745 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.771729734 +0000 UTC m=+226.013608931 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.373819 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.374353 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.87434197 +0000 UTC m=+226.116221167 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.410037 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d"]
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.485457 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.485891 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.985874096 +0000 UTC m=+226.227753293 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.549504 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6lk5b"]
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.572011 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" podStartSLOduration=165.571996856 podStartE2EDuration="2m45.571996856s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:36.567046554 +0000 UTC m=+225.808925751" watchObservedRunningTime="2026-03-01 09:11:36.571996856 +0000 UTC m=+225.813876053"
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.587419 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.587850 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.087833156 +0000 UTC m=+226.329712353 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.603410 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-qtg4x"
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.610787 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs"]
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.618309 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 01 09:11:36 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld
Mar 01 09:11:36 crc kubenswrapper[4792]: [+]process-running ok
Mar 01 09:11:36 crc kubenswrapper[4792]: healthz check failed
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.618364 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.632046 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-qtg4x" podStartSLOduration=164.632027764 podStartE2EDuration="2m44.632027764s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:36.63064046 +0000 UTC m=+225.872519657" watchObservedRunningTime="2026-03-01 09:11:36.632027764 +0000 UTC m=+225.873906961"
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.658235 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-r7d4f"]
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.689130 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.689393 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.189375696 +0000 UTC m=+226.431254893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.689632 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.689937 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.189895189 +0000 UTC m=+226.431774386 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.754670 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" podStartSLOduration=165.754650593 podStartE2EDuration="2m45.754650593s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:36.739936081 +0000 UTC m=+225.981815278" watchObservedRunningTime="2026-03-01 09:11:36.754650593 +0000 UTC m=+225.996529790"
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.756812 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8"]
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.790883 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.791248 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.291233174 +0000 UTC m=+226.533112371 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.830144 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj"]
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.834707 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50278: no serving certificate available for the kubelet"
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.881144 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l"]
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.891961 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.892245 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.39223352 +0000 UTC m=+226.634112717 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.909271 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" event={"ID":"644b4b74-7ce9-4d36-8938-58a1e2b2b49f","Type":"ContainerStarted","Data":"deb4cb205db772c57f32cd07de729e7db366c1ef515aecd7666695017d360fb0"}
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.920996 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c"]
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.969510 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50294: no serving certificate available for the kubelet"
Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.993253 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.993877 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.493846022 +0000 UTC m=+226.735725219 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.006464 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-64dsw"]
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.041664 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" event={"ID":"cc60e4e1-1b94-4913-879c-fbd25ff314b9","Type":"ContainerStarted","Data":"1e8deb08c9230c5fafd4cb5d4f271455cb951a1172b27605143da8179962768a"}
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.042543 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm"]
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.094325 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50296: no serving certificate available for the kubelet"
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.095324 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.096706 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.596692644 +0000 UTC m=+226.838571841 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.122040 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" event={"ID":"a29641af-98a4-47ca-baca-7e933d7a00d5","Type":"ContainerStarted","Data":"a669e0b4482e33ffdb8d53bbc5ad2cf935a146e10be2babeb35ee8efc1288240"}
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.148471 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn"]
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.164560 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" event={"ID":"d25464be-fe72-4409-a934-9e8c70542ed6","Type":"ContainerStarted","Data":"bceb9c902218cbbaf0ce238fdc30fb7b9ec3eb272099123b2052cead5b010326"}
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.199257 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.199669 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.699652948 +0000 UTC m=+226.941532145 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.205163 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" event={"ID":"0a5ad85c-19b5-432d-aa36-d0db74e44744","Type":"ContainerStarted","Data":"9e342f117a951fbdc417be13b431b7e6f53f8a26812ff4e26f3b920583eaf4fa"}
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.217425 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50310: no serving certificate available for the kubelet"
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.217822 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-glj9p" event={"ID":"3f7932d3-c8c1-4f66-94fb-ea1a45b46889","Type":"ContainerStarted","Data":"4ccee4d48a72de23a14c4744d4a706ffdc8caaae4c1f3e4c7c88707cc64282b1"}
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.228817 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-wxl8v" podStartSLOduration=165.228797466 podStartE2EDuration="2m45.228797466s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.18349433 +0000 UTC m=+226.425373527" watchObservedRunningTime="2026-03-01 09:11:37.228797466 +0000 UTC m=+226.470676663"
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.229385 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-zrzcg" podStartSLOduration=165.22937988 podStartE2EDuration="2m45.22937988s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.22570912 +0000 UTC m=+226.467588317" watchObservedRunningTime="2026-03-01 09:11:37.22937988 +0000 UTC m=+226.471259077"
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.249174 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" event={"ID":"fb3b55fa-972b-4231-8445-bd4cd9a8b88b","Type":"ContainerStarted","Data":"d0f8b4226a10c737a4e4e1fb0342d829ed6a9a55651af6451d89aad4be0eb6c3"}
Mar 01 09:11:37 crc kubenswrapper[4792]: W0301 09:11:37.270761 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c1a0aad_45a6_45d3_bc5d_bbbf2e4fdcc3.slice/crio-e332fd58bac36920eb992b11914f683b07598dd8216c452a72973706723d6019 WatchSource:0}: Error finding container e332fd58bac36920eb992b11914f683b07598dd8216c452a72973706723d6019: Status 404 returned error can't find the container with id e332fd58bac36920eb992b11914f683b07598dd8216c452a72973706723d6019
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.281129 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gk6c6"]
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.290689 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" podStartSLOduration=166.290668899 podStartE2EDuration="2m46.290668899s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.289259594 +0000 UTC m=+226.531138791" watchObservedRunningTime="2026-03-01 09:11:37.290668899 +0000 UTC m=+226.532548086"
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.296968 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dgh8q"]
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.304507 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.305667 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.805653048 +0000 UTC m=+227.047532245 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.331769 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rjwhk"]
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.339977 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" event={"ID":"f3480f1b-eedb-4bc8-b40f-5c527869096a","Type":"ContainerStarted","Data":"4134ab0e66c0f4d4a6c95c020bf21196fa893d296dd143d28e5c16f18fad1e82"}
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.357560 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" podStartSLOduration=165.357542185 podStartE2EDuration="2m45.357542185s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.338894106 +0000 UTC m=+226.580773303" watchObservedRunningTime="2026-03-01 09:11:37.357542185 +0000 UTC m=+226.599421382"
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.366001 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50316: no serving certificate available for the kubelet"
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.376683 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" event={"ID":"b1f7190f-8547-4938-8023-708e4891409d","Type":"ContainerStarted","Data":"a7da6efe1203d3322dcc417e6fdf30da39440d2bce52db39523da3be26f9f320"}
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.406579 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.407712 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.907664049 +0000 UTC m=+227.149543246 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.427534 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" podStartSLOduration=165.427519298 podStartE2EDuration="2m45.427519298s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.418968867 +0000 UTC m=+226.660848064" watchObservedRunningTime="2026-03-01 09:11:37.427519298 +0000 UTC m=+226.669398495"
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.447318 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" podStartSLOduration=166.447301875 podStartE2EDuration="2m46.447301875s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.444459405 +0000 UTC m=+226.686338602" watchObservedRunningTime="2026-03-01 09:11:37.447301875 +0000 UTC m=+226.689181072"
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.474535 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" podStartSLOduration=165.474520015 podStartE2EDuration="2m45.474520015s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.47268005 +0000 UTC m=+226.714559247" watchObservedRunningTime="2026-03-01 09:11:37.474520015 +0000 UTC m=+226.716399212"
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.499177 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" event={"ID":"d566570d-4f58-487b-b824-839792e88650","Type":"ContainerStarted","Data":"b437d533301ec1a8bf145b28fa75526b9244e8515ab202b1f8f1ae9e07990c81"}
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.499743 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-glj9p" podStartSLOduration=6.499726045 podStartE2EDuration="6.499726045s" podCreationTimestamp="2026-03-01 09:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.499120781 +0000 UTC m=+226.740999978" watchObservedRunningTime="2026-03-01 09:11:37.499726045 +0000 UTC m=+226.741605242"
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.507598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.511362 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.011331181 +0000 UTC m=+227.253210378 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.516101 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" event={"ID":"499393fc-abcf-4998-9e32-3d43a0b1e488","Type":"ContainerStarted","Data":"038db8caba2d61180a1409412260d624f871b2f05eaa4b1054a8c506f49680ee"}
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.539606 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" podStartSLOduration=165.539589497 podStartE2EDuration="2m45.539589497s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.538975432 +0000 UTC m=+226.780854629" watchObservedRunningTime="2026-03-01 09:11:37.539589497 +0000 UTC m=+226.781468694"
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.589329 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50318: no serving certificate available for the kubelet"
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.590480 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" podStartSLOduration=165.590464849 podStartE2EDuration="2m45.590464849s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.589744992 +0000 UTC m=+226.831624189" watchObservedRunningTime="2026-03-01 09:11:37.590464849 +0000 UTC m=+226.832344036"
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.591016 4792 generic.go:334] "Generic (PLEG): container finished" podID="9ee9e9d4-e788-41cb-b601-035551b5338c" containerID="14601b3d86899c0b127c11ea852ecc09b467fb320b1f1a289b6ea1819e6deeed" exitCode=0
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.591109 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" event={"ID":"9ee9e9d4-e788-41cb-b601-035551b5338c","Type":"ContainerDied","Data":"14601b3d86899c0b127c11ea852ecc09b467fb320b1f1a289b6ea1819e6deeed"}
Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.591749 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"
Mar 01 09:11:37
crc kubenswrapper[4792]: I0301 09:11:37.603350 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" event={"ID":"9ab8f156-05d7-47d8-b849-a49f1c5cf03b","Type":"ContainerStarted","Data":"0ee72213a44cb66137fb2d70f1a1b01a4afa69cb60d872ae25778f630b635928"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.613638 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:37 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:37 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:37 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.613696 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.614072 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.614443 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.114422649 +0000 UTC m=+227.356301856 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.637517 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" event={"ID":"6f49f99d-4119-400a-88d5-6fdf48da4d64","Type":"ContainerStarted","Data":"d9feede3d11684c685f93f46f379d2cdbd7d87e0cfb7e43c4bccc7132411f21d"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.655581 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" podStartSLOduration=165.655568262 podStartE2EDuration="2m45.655568262s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.653386648 +0000 UTC m=+226.895265845" watchObservedRunningTime="2026-03-01 09:11:37.655568262 +0000 UTC m=+226.897447449" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.684495 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" podStartSLOduration=165.684476294 podStartE2EDuration="2m45.684476294s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.682950726 +0000 UTC m=+226.924829923" watchObservedRunningTime="2026-03-01 09:11:37.684476294 +0000 UTC m=+226.926355491" Mar 01 09:11:37 crc 
kubenswrapper[4792]: I0301 09:11:37.698774 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50322: no serving certificate available for the kubelet" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.709123 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" event={"ID":"011a4c5f-1a18-4f0d-884f-43bb6477efb6","Type":"ContainerStarted","Data":"2dc75b9ebc33c1710fdb48d0d0e0e4568bf56596b98b86a29ae12f220ad2dc9a"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.710110 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-wxl8v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.710152 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wxl8v" podUID="6cc55bdf-6c0f-4d35-879f-c64c2dc4897c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.715183 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.728675 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.728799 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.228776884 +0000 UTC m=+227.470656081 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.732880 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" podStartSLOduration=166.732860715 podStartE2EDuration="2m46.732860715s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.731610164 +0000 UTC m=+226.973489381" watchObservedRunningTime="2026-03-01 09:11:37.732860715 +0000 UTC m=+226.974739912" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.779575 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.816534 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.816998 4792 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.316982476 +0000 UTC m=+227.558861673 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.819869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.820199 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.320189595 +0000 UTC m=+227.562068792 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.853279 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50338: no serving certificate available for the kubelet" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.926418 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.926770 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.426755718 +0000 UTC m=+227.668634915 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.031540 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.031790 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.531779334 +0000 UTC m=+227.773658531 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.132586 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.138789 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.638765128 +0000 UTC m=+227.880644325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.235608 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.235983 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.735971481 +0000 UTC m=+227.977850678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.336291 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.336463 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.836430744 +0000 UTC m=+228.078309941 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.336969 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.337326 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.837313836 +0000 UTC m=+228.079193033 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.437511 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.437942 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.937923562 +0000 UTC m=+228.179802759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.525205 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wl9zt"] Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.543279 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.043267936 +0000 UTC m=+228.285147133 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.543001 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.608548 4792 patch_prober.go:28] 
interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:38 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:38 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:38 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.608823 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.625997 4792 ???:1] "http: TLS handshake error from 192.168.126.11:37460: no serving certificate available for the kubelet" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.628406 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"] Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.645340 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.645694 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.145665607 +0000 UTC m=+228.387544794 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.721466 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-8qrq4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.721505 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" podUID="eeacfd31-08e1-49e6-afda-95efa2d815d2" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.745338 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" event={"ID":"03389f1b-2d84-4b8b-879d-545498a154cc","Type":"ContainerStarted","Data":"be5dca9df6311edb2af7a577d893ee5693464113e3b6cc2ba2321f2d46420c86"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.745400 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" event={"ID":"03389f1b-2d84-4b8b-879d-545498a154cc","Type":"ContainerStarted","Data":"dfb9fd1ab6ea1a0f91d85ac829d57bd567353f35088f191e271584b0ccf4e0ec"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.746645 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.747033 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.247020752 +0000 UTC m=+228.488899949 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.762890 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" event={"ID":"6f49f99d-4119-400a-88d5-6fdf48da4d64","Type":"ContainerStarted","Data":"b18c2370e5e8fce7f5f9bb58fb2c42790591fd6902256b0182fd699718ea3bb4"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.773208 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" event={"ID":"9ab8f156-05d7-47d8-b849-a49f1c5cf03b","Type":"ContainerStarted","Data":"5c5a2b17f732f695da3d4866b48be8b4675888ddeed2320710da5fc39d06bbdb"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.775180 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" event={"ID":"644b4b74-7ce9-4d36-8938-58a1e2b2b49f","Type":"ContainerStarted","Data":"1d1e2d22704d416cbd49572f2e0f199e4f1cbc9524b80c29bc32d87307e59b79"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.776523 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" event={"ID":"51683a24-edad-4808-b2ec-6a628bfdd937","Type":"ContainerStarted","Data":"7330b89e50cc3be0cf56d7d80385f1422cd9b492692ed2007be996954dfaf2cd"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.792252 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" event={"ID":"e37e6dcb-be13-4787-8555-3ba1050f7b77","Type":"ContainerStarted","Data":"8c78eddf897c2741ce992ebcf0cd416c60d059664ef1623df2239b0550809bd0"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.792301 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" event={"ID":"e37e6dcb-be13-4787-8555-3ba1050f7b77","Type":"ContainerStarted","Data":"5fe7291196c34fc28ce9dd0b3ec6175bb85545475a2aafe649770e45dc76a617"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.792673 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.799755 4792 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gk6c6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.799810 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" 
podUID="e37e6dcb-be13-4787-8555-3ba1050f7b77" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.817199 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" podStartSLOduration=166.817180909 podStartE2EDuration="2m46.817180909s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:38.814839811 +0000 UTC m=+228.056719008" watchObservedRunningTime="2026-03-01 09:11:38.817180909 +0000 UTC m=+228.059060106" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.833592 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-64dsw" event={"ID":"9b2af767-57b1-4774-9668-6610e9ac1bb9","Type":"ContainerStarted","Data":"abaead56598a20893bfa16ce2ecd0037f3a66d16ae9b1a77ca85a4700fdf9590"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.840062 4792 generic.go:334] "Generic (PLEG): container finished" podID="499393fc-abcf-4998-9e32-3d43a0b1e488" containerID="d19e02d3394aff309dd88dd48aab49226b866dfa35122864b9992aeb050f8afd" exitCode=0 Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.840114 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" event={"ID":"499393fc-abcf-4998-9e32-3d43a0b1e488","Type":"ContainerDied","Data":"d19e02d3394aff309dd88dd48aab49226b866dfa35122864b9992aeb050f8afd"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.845459 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dgh8q" 
event={"ID":"bd837cd0-c714-48e1-8771-cc6c419f7639","Type":"ContainerStarted","Data":"1212105a46de0acf3cd80080586379061d3be2534ce8e94ecb5417d6b6b7e92c"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.850685 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.851645 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.351627427 +0000 UTC m=+228.593506624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.867709 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" event={"ID":"b1f7190f-8547-4938-8023-708e4891409d","Type":"ContainerStarted","Data":"925c80281e9b04110590674a902f3a43bab170a2e87b3fb5f642b3323460bb96"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.899248 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" 
event={"ID":"9d95b2fd-64be-4688-a596-c41bb31cb9c4","Type":"ContainerStarted","Data":"60179549a9f0b5dbaf0c97716a3f646403d0000e866b3ec196196bebb111fdf6"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.904605 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" event={"ID":"5c679376-bb09-4944-b4ee-3710661612b5","Type":"ContainerStarted","Data":"b4afdc9e206052462c89c3fd37ca081334d6d0b37f35a823344ae1d150a66765"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.904626 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" event={"ID":"5c679376-bb09-4944-b4ee-3710661612b5","Type":"ContainerStarted","Data":"65210aeaea8be68edc35f13511f341f60e638663e33c4559312f6d0fb836a124"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.905437 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.906864 4792 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-7l7sj container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.906925 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" podUID="5c679376-bb09-4944-b4ee-3710661612b5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.952936 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.953275 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" event={"ID":"f3480f1b-eedb-4bc8-b40f-5c527869096a","Type":"ContainerStarted","Data":"ed567a0606fbc6632e6937fff7b5a0bb538e5c37024978b1b514629d510fa5e4"} Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.953677 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.453658308 +0000 UTC m=+228.695537595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.975290 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" event={"ID":"a29641af-98a4-47ca-baca-7e933d7a00d5","Type":"ContainerStarted","Data":"310e2042f2dd50873aa63a05decd48e217cbb9950371820b3adbbd2fa7609208"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.989571 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" podStartSLOduration=166.989548531 podStartE2EDuration="2m46.989548531s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:38.896274646 +0000 UTC m=+228.138153833" watchObservedRunningTime="2026-03-01 09:11:38.989548531 +0000 UTC m=+228.231427728" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.003470 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" event={"ID":"77e0e285-570c-47bd-854e-538c9367486b","Type":"ContainerStarted","Data":"cca2ec9af1276f9b1ad87b56b9c2e2a8e7900da42011183a721ef128a1b49dc1"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.015926 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" 
event={"ID":"cc60e4e1-1b94-4913-879c-fbd25ff314b9","Type":"ContainerStarted","Data":"20fb1032019b139ae4a359755a225f6fce14f8690da82b73371ec721bf4673cb"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.017302 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.018216 4792 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qsc9d container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.018282 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" podUID="cc60e4e1-1b94-4913-879c-fbd25ff314b9" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.026561 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" event={"ID":"b72d8baa-f3f8-4263-a36d-9741ad4243d5","Type":"ContainerStarted","Data":"1ab007922347514bc5d518da115315068e031c931fbd22701561beac17eb7fe2"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.026629 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" event={"ID":"b72d8baa-f3f8-4263-a36d-9741ad4243d5","Type":"ContainerStarted","Data":"93d57c76821b6e6e039439288be1d0054cddb9b84b3c596c23b2a851a1266b29"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.027212 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.028256 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4jk5c container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.028294 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" podUID="b72d8baa-f3f8-4263-a36d-9741ad4243d5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.039050 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" event={"ID":"9ee9e9d4-e788-41cb-b601-035551b5338c","Type":"ContainerStarted","Data":"3dcbc32a302bd6ebdeb601d73aa8576d35bceb3405d2c12ddb512b11a49f528f"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.054518 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.055568 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.555541256 +0000 UTC m=+228.797420453 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.059088 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" event={"ID":"04b15432-c193-4b0c-b527-df9a9b37c886","Type":"ContainerStarted","Data":"36808e59f4f8c78214d7893f962d00204ea81733bd6ea7647736d9ad3a1e0d3c"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.059136 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" event={"ID":"04b15432-c193-4b0c-b527-df9a9b37c886","Type":"ContainerStarted","Data":"118cde52251d3c7901a6a4aaffb7eca4c854199aa8f158dcce2329331bd9556e"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.106594 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" event={"ID":"011a4c5f-1a18-4f0d-884f-43bb6477efb6","Type":"ContainerStarted","Data":"6921e70e382ff7e5603326ccb1dd2ca4e61c2b0663734c079c837b7a00b56460"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.138768 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" event={"ID":"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3","Type":"ContainerStarted","Data":"db93ce9b530ac529df661a0d9b5aa2418392875dd62f9aecf46dc33ff7dfd43c"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.138813 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" event={"ID":"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3","Type":"ContainerStarted","Data":"e332fd58bac36920eb992b11914f683b07598dd8216c452a72973706723d6019"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.156284 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.156738 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" podStartSLOduration=167.156722867 podStartE2EDuration="2m47.156722867s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.155650421 +0000 UTC m=+228.397529618" watchObservedRunningTime="2026-03-01 09:11:39.156722867 +0000 UTC m=+228.398602064" Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.159555 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.659541296 +0000 UTC m=+228.901420573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.176289 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" event={"ID":"d25464be-fe72-4409-a934-9e8c70542ed6","Type":"ContainerStarted","Data":"d2e31f8f3a3cfe7df3dfe74597fc3ce0a32ea60b69cfa1435f8678ab36607202"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.215166 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rjwhk" event={"ID":"9c283b49-5e58-4c99-97c2-d53ab428265f","Type":"ContainerStarted","Data":"4880a13bcfcb4c6ccb79a8e246e35f9ebc144536cdd75e778c6a07e018feab1d"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.248296 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" podUID="44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" containerName="route-controller-manager" containerID="cri-o://ee95da5910e6e2c13434125c99652d9a77af8d0b5457ae7a7ce20a9282ee3b00" gracePeriod=30 Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.249052 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" event={"ID":"d566570d-4f58-487b-b824-839792e88650","Type":"ContainerStarted","Data":"f0dac4830900a0ec6f28445041d7a79b7ef7e37a2991e9e677f9b5b3a8edb055"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.259391 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.260043 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.76002848 +0000 UTC m=+229.001907677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.320418 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" podStartSLOduration=168.320401057 podStartE2EDuration="2m48.320401057s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.240356676 +0000 UTC m=+228.482235873" watchObservedRunningTime="2026-03-01 09:11:39.320401057 +0000 UTC m=+228.562280254" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.361160 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.361998 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.86198053 +0000 UTC m=+229.103859727 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.410215 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" podStartSLOduration=167.410194277 podStartE2EDuration="2m47.410194277s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.329202903 +0000 UTC m=+228.571082100" watchObservedRunningTime="2026-03-01 09:11:39.410194277 +0000 UTC m=+228.652073474" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.410828 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" podStartSLOduration=167.410823283 podStartE2EDuration="2m47.410823283s" 
podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.404986569 +0000 UTC m=+228.646865766" watchObservedRunningTime="2026-03-01 09:11:39.410823283 +0000 UTC m=+228.652702480" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.463259 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.463443 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.963416557 +0000 UTC m=+229.205295754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.463509 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.463832 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.963824867 +0000 UTC m=+229.205704064 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.539052 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.541604 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" podStartSLOduration=167.541594322 podStartE2EDuration="2m47.541594322s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.53135845 +0000 UTC m=+228.773237637" watchObservedRunningTime="2026-03-01 09:11:39.541594322 +0000 UTC m=+228.783473519" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.542074 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" podStartSLOduration=167.542069834 podStartE2EDuration="2m47.542069834s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.504824727 +0000 UTC m=+228.746703924" watchObservedRunningTime="2026-03-01 09:11:39.542069834 +0000 UTC m=+228.783949031" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.564462 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.564720 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.064704191 +0000 UTC m=+229.306583388 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.564830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.565177 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.065167422 +0000 UTC m=+229.307046619 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.605773 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:39 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:39 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:39 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.605822 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.665610 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.665756 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-01 09:11:40.165731958 +0000 UTC m=+229.407611155 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.666011 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.666290 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.166277941 +0000 UTC m=+229.408157138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.729928 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" podStartSLOduration=167.729914208 podStartE2EDuration="2m47.729914208s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.723581342 +0000 UTC m=+228.965460539" watchObservedRunningTime="2026-03-01 09:11:39.729914208 +0000 UTC m=+228.971793395" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.730355 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" podStartSLOduration=167.730350979 podStartE2EDuration="2m47.730350979s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.661173046 +0000 UTC m=+228.903052243" watchObservedRunningTime="2026-03-01 09:11:39.730350979 +0000 UTC m=+228.972230176" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.767468 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.767892 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.267873482 +0000 UTC m=+229.509752679 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.832942 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" podStartSLOduration=167.832899433 podStartE2EDuration="2m47.832899433s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.83237306 +0000 UTC m=+229.074252257" watchObservedRunningTime="2026-03-01 09:11:39.832899433 +0000 UTC m=+229.074778620" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.862580 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" podStartSLOduration=167.862566614 podStartE2EDuration="2m47.862566614s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.861283422 +0000 UTC 
m=+229.103162619" watchObservedRunningTime="2026-03-01 09:11:39.862566614 +0000 UTC m=+229.104445811" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.869031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.869372 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.369358351 +0000 UTC m=+229.611237548 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.970645 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.971109 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.471092965 +0000 UTC m=+229.712972162 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.971206 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.971506 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.471498395 +0000 UTC m=+229.713377582 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.072760 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.073101 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.573086576 +0000 UTC m=+229.814965773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.176576 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.177052 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.677031745 +0000 UTC m=+229.918910942 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.230070 4792 ???:1] "http: TLS handshake error from 192.168.126.11:37472: no serving certificate available for the kubelet" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.259917 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" event={"ID":"f3480f1b-eedb-4bc8-b40f-5c527869096a","Type":"ContainerStarted","Data":"2cd2b1c1fbe79104b79a7890f1c68da705c6c5e1f5dbbfa0920403dc7e96efba"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.264640 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rjwhk" event={"ID":"9c283b49-5e58-4c99-97c2-d53ab428265f","Type":"ContainerStarted","Data":"e53e3a76c13c685b981c673a044faf023e87e94415d3dcf45af467741eee12f5"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.268286 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" event={"ID":"9ab8f156-05d7-47d8-b849-a49f1c5cf03b","Type":"ContainerStarted","Data":"84811a8ce860a6a4cc0a951effc898bb30e8e3deb4250372376b597312011ceb"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.268458 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.270754 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-64dsw" event={"ID":"9b2af767-57b1-4774-9668-6610e9ac1bb9","Type":"ContainerStarted","Data":"6fb9ac3e107496f78583b693f70c6629624ebf04472a6d608ffab0ddb5dba2bc"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.279576 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.280002 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.77998429 +0000 UTC m=+230.021863487 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.304315 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" event={"ID":"04b15432-c193-4b0c-b527-df9a9b37c886","Type":"ContainerStarted","Data":"c365ade0cbb23c31a7311c8d7fe1c2e6fab58e5adf5f5b03ac2553d4a5c21143"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.326425 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" 
event={"ID":"011a4c5f-1a18-4f0d-884f-43bb6477efb6","Type":"ContainerStarted","Data":"5dc3ca91e3a6774ef6c0336afe891e3614c113107660d02914bf8d650bab1f0a"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.328817 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" event={"ID":"644b4b74-7ce9-4d36-8938-58a1e2b2b49f","Type":"ContainerStarted","Data":"cbc7ed6e024b61eac779c0b97e21c1f04fb520a328aa8f9223497fda65407cf5"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.340523 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" event={"ID":"499393fc-abcf-4998-9e32-3d43a0b1e488","Type":"ContainerStarted","Data":"84b95879ba5e2c4dff9cbfdb619c290685a9892fa4f1d69242e07c9ebdc9f878"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.340570 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" event={"ID":"499393fc-abcf-4998-9e32-3d43a0b1e488","Type":"ContainerStarted","Data":"6a2e2c0b304e235ef71551cb9a82a7f076605c19141d9dc370a420d2898365ee"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.343514 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" podStartSLOduration=168.343498203 podStartE2EDuration="2m48.343498203s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:40.343169835 +0000 UTC m=+229.585049032" watchObservedRunningTime="2026-03-01 09:11:40.343498203 +0000 UTC m=+229.585377400" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.346602 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dgh8q" 
event={"ID":"bd837cd0-c714-48e1-8771-cc6c419f7639","Type":"ContainerStarted","Data":"80b46179a65824ed46675ea3d847fe75d4b6f791a318c5df26408d25529b470b"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.356233 4792 generic.go:334] "Generic (PLEG): container finished" podID="44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" containerID="ee95da5910e6e2c13434125c99652d9a77af8d0b5457ae7a7ce20a9282ee3b00" exitCode=0 Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.357002 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" event={"ID":"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2","Type":"ContainerDied","Data":"ee95da5910e6e2c13434125c99652d9a77af8d0b5457ae7a7ce20a9282ee3b00"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.357483 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4jk5c container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.357519 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" podUID="b72d8baa-f3f8-4263-a36d-9741ad4243d5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.357572 4792 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-7l7sj container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.357608 4792 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" podUID="5c679376-bb09-4944-b4ee-3710661612b5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.368177 4792 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gk6c6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.368245 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" podUID="e37e6dcb-be13-4787-8555-3ba1050f7b77" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.368403 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" podUID="8578f8dc-143c-423c-b62b-b3190444bafd" containerName="controller-manager" containerID="cri-o://c13739edba999be4dbdb0675da20f693f5c2f873f236f49fe221573f802f1672" gracePeriod=30 Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.380932 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.382698 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.882676608 +0000 UTC m=+230.124555805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.455348 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.482660 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.482859 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.982827693 +0000 UTC m=+230.224706890 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.487608 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.488143 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.988130314 +0000 UTC m=+230.230009511 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.513265 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" podStartSLOduration=168.513248482 podStartE2EDuration="2m48.513248482s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:40.42947762 +0000 UTC m=+229.671356817" watchObservedRunningTime="2026-03-01 09:11:40.513248482 +0000 UTC m=+229.755127679" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.568016 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" podStartSLOduration=169.56799688 podStartE2EDuration="2m49.56799688s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:40.515687362 +0000 UTC m=+229.757566569" watchObservedRunningTime="2026-03-01 09:11:40.56799688 +0000 UTC m=+229.809876077" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.589639 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.590283 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.090268839 +0000 UTC m=+230.332148036 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.609769 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" podStartSLOduration=168.609751268 podStartE2EDuration="2m48.609751268s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:40.570885631 +0000 UTC m=+229.812764828" watchObservedRunningTime="2026-03-01 09:11:40.609751268 +0000 UTC m=+229.851630465" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.644136 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:40 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:40 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:40 crc kubenswrapper[4792]: 
healthz check failed Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.644195 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.652554 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" podStartSLOduration=168.652537511 podStartE2EDuration="2m48.652537511s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:40.610317812 +0000 UTC m=+229.852197009" watchObservedRunningTime="2026-03-01 09:11:40.652537511 +0000 UTC m=+229.894416708" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.691518 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.691865 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.191854019 +0000 UTC m=+230.433733216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.792627 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.793010 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.292992899 +0000 UTC m=+230.534872096 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.793075 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.793328 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.293317907 +0000 UTC m=+230.535197104 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.851310 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.886586 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dgh8q" podStartSLOduration=9.886564413 podStartE2EDuration="9.886564413s" podCreationTimestamp="2026-03-01 09:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:40.716482686 +0000 UTC m=+229.958361883" watchObservedRunningTime="2026-03-01 09:11:40.886564413 +0000 UTC m=+230.128443610"
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.889214 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"]
Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.889403 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" containerName="route-controller-manager"
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.889418 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" containerName="route-controller-manager"
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.889512 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" containerName="route-controller-manager"
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.889852 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.893848 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.894181 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.3941634 +0000 UTC m=+230.636042597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.908088 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"]
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.995373 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-config\") pod \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") "
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996127 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-client-ca" (OuterVolumeSpecName: "client-ca") pod "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" (UID: "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996161 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-client-ca\") pod \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") "
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996227 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnjdm\" (UniqueName: \"kubernetes.io/projected/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-kube-api-access-fnjdm\") pod \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") "
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996254 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-serving-cert\") pod \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") "
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996375 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-client-ca\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996424 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwpwk\" (UniqueName: \"kubernetes.io/projected/7e3e715b-f024-4842-94ed-1f1e054e89c6-kube-api-access-pwpwk\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996451 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-config\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996510 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3e715b-f024-4842-94ed-1f1e054e89c6-serving-cert\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996572 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-client-ca\") on node \"crc\" DevicePath \"\""
Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.997555 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.497543775 +0000 UTC m=+230.739422972 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.997742 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-config" (OuterVolumeSpecName: "config") pod "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" (UID: "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.017164 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-kube-api-access-fnjdm" (OuterVolumeSpecName: "kube-api-access-fnjdm") pod "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" (UID: "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2"). InnerVolumeSpecName "kube-api-access-fnjdm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.021237 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" (UID: "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097033 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.097367 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.597315401 +0000 UTC m=+230.839194598 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097516 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3e715b-f024-4842-94ed-1f1e054e89c6-serving-cert\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097583 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-client-ca\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097623 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwpwk\" (UniqueName: \"kubernetes.io/projected/7e3e715b-f024-4842-94ed-1f1e054e89c6-kube-api-access-pwpwk\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097648 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-config\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097677 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097708 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-config\") on node \"crc\" DevicePath \"\""
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097718 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnjdm\" (UniqueName: \"kubernetes.io/projected/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-kube-api-access-fnjdm\") on node \"crc\" DevicePath \"\""
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097730 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.097992 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.597979808 +0000 UTC m=+230.839859005 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.100063 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-client-ca\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.101405 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-config\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.103148 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3e715b-f024-4842-94ed-1f1e054e89c6-serving-cert\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.124209 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwpwk\" (UniqueName: \"kubernetes.io/projected/7e3e715b-f024-4842-94ed-1f1e054e89c6-kube-api-access-pwpwk\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.198603 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.198800 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.698774659 +0000 UTC m=+230.940653856 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.199019 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.199323 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.699310842 +0000 UTC m=+230.941190039 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.203262 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.302398 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.302872 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.802855101 +0000 UTC m=+231.044734298 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.380495 4792 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-2l2w7 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.380559 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" podUID="9ee9e9d4-e788-41cb-b601-035551b5338c" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.403871 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.404228 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.904212107 +0000 UTC m=+231.146091304 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.421835 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.427377 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" event={"ID":"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2","Type":"ContainerDied","Data":"c11caa71735c72b6244f64898704ac0350a94ec2fdec7c12cfb52f25b0fabfa9"}
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.427421 4792 scope.go:117] "RemoveContainer" containerID="ee95da5910e6e2c13434125c99652d9a77af8d0b5457ae7a7ce20a9282ee3b00"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.448418 4792 generic.go:334] "Generic (PLEG): container finished" podID="8578f8dc-143c-423c-b62b-b3190444bafd" containerID="c13739edba999be4dbdb0675da20f693f5c2f873f236f49fe221573f802f1672" exitCode=0
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.448479 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" event={"ID":"8578f8dc-143c-423c-b62b-b3190444bafd","Type":"ContainerDied","Data":"c13739edba999be4dbdb0675da20f693f5c2f873f236f49fe221573f802f1672"}
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.457251 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.462169 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rjwhk" event={"ID":"9c283b49-5e58-4c99-97c2-d53ab428265f","Type":"ContainerStarted","Data":"f4727a8f5737e6c7fab94f58e28153995e9689650f62a525e2ac2573957fbb21"}
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.462642 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-rjwhk"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.479646 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.504552 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.504707 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.00467728 +0000 UTC m=+231.246556477 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.504738 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.505163 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.005150751 +0000 UTC m=+231.247029948 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.624159 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.624533 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-proxy-ca-bundles\") pod \"8578f8dc-143c-423c-b62b-b3190444bafd\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") "
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.624568 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-config\") pod \"8578f8dc-143c-423c-b62b-b3190444bafd\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") "
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.624601 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-client-ca\") pod \"8578f8dc-143c-423c-b62b-b3190444bafd\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") "
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.624628 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8578f8dc-143c-423c-b62b-b3190444bafd-serving-cert\") pod \"8578f8dc-143c-423c-b62b-b3190444bafd\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") "
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.624647 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qq5x\" (UniqueName: \"kubernetes.io/projected/8578f8dc-143c-423c-b62b-b3190444bafd-kube-api-access-6qq5x\") pod \"8578f8dc-143c-423c-b62b-b3190444bafd\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") "
Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.626679 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.126659583 +0000 UTC m=+231.368538780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.630806 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-client-ca" (OuterVolumeSpecName: "client-ca") pod "8578f8dc-143c-423c-b62b-b3190444bafd" (UID: "8578f8dc-143c-423c-b62b-b3190444bafd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.630866 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-config" (OuterVolumeSpecName: "config") pod "8578f8dc-143c-423c-b62b-b3190444bafd" (UID: "8578f8dc-143c-423c-b62b-b3190444bafd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.636528 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 01 09:11:41 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld
Mar 01 09:11:41 crc kubenswrapper[4792]: [+]process-running ok
Mar 01 09:11:41 crc kubenswrapper[4792]: healthz check failed
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.636578 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.645473 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8578f8dc-143c-423c-b62b-b3190444bafd" (UID: "8578f8dc-143c-423c-b62b-b3190444bafd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.659899 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8578f8dc-143c-423c-b62b-b3190444bafd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8578f8dc-143c-423c-b62b-b3190444bafd" (UID: "8578f8dc-143c-423c-b62b-b3190444bafd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.732416 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8578f8dc-143c-423c-b62b-b3190444bafd-kube-api-access-6qq5x" (OuterVolumeSpecName: "kube-api-access-6qq5x") pod "8578f8dc-143c-423c-b62b-b3190444bafd" (UID: "8578f8dc-143c-423c-b62b-b3190444bafd"). InnerVolumeSpecName "kube-api-access-6qq5x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.732928 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.733030 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.733041 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-config\") on node \"crc\" DevicePath \"\""
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.733051 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-client-ca\") on node \"crc\" DevicePath \"\""
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.733058 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8578f8dc-143c-423c-b62b-b3190444bafd-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.733068 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qq5x\" (UniqueName: \"kubernetes.io/projected/8578f8dc-143c-423c-b62b-b3190444bafd-kube-api-access-6qq5x\") on node \"crc\" DevicePath \"\""
Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.733514 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.233502173 +0000 UTC m=+231.475381370 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.772493 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rjwhk" podStartSLOduration=10.772470993 podStartE2EDuration="10.772470993s" podCreationTimestamp="2026-03-01 09:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:41.648270765 +0000 UTC m=+230.890149962" watchObservedRunningTime="2026-03-01 09:11:41.772470993 +0000 UTC m=+231.014350190"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.821504 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"]
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.837334 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.837867 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.837934 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.837957 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.837979 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.848234 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.348208707 +0000 UTC m=+231.590087904 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.849162 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"]
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.850509 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.851570 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.859885 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.866681
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.876865 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.950666 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.951046 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.451034338 +0000 UTC m=+231.692913535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.980501 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"] Mar 01 09:11:42 crc kubenswrapper[4792]: W0301 09:11:42.003275 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e3e715b_f024_4842_94ed_1f1e054e89c6.slice/crio-92e007ff2e490cd943aeebddedb6f93d838bbb65607bec7ae9a1be207a92d259 WatchSource:0}: Error finding container 92e007ff2e490cd943aeebddedb6f93d838bbb65607bec7ae9a1be207a92d259: Status 404 returned error can't find the container with id 92e007ff2e490cd943aeebddedb6f93d838bbb65607bec7ae9a1be207a92d259 Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.051939 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.052167 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.552135787 +0000 UTC m=+231.794014984 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.052311 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.052686 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.552670701 +0000 UTC m=+231.794549958 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.134968 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.153369 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.153775 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.653756589 +0000 UTC m=+231.895635786 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.178872 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.228661 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.255743 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.256103 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.756091308 +0000 UTC m=+231.997970505 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.357419 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.357712 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.85769898 +0000 UTC m=+232.099578177 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.448588 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.459646 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.459953 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.959941927 +0000 UTC m=+232.201821124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.528269 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.536126 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" event={"ID":"8578f8dc-143c-423c-b62b-b3190444bafd","Type":"ContainerDied","Data":"452f21dc7923df996fc4ebcc58043ac03b69b3315c7778ffe5676b68b45c4e4f"} Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.536185 4792 scope.go:117] "RemoveContainer" containerID="c13739edba999be4dbdb0675da20f693f5c2f873f236f49fe221573f802f1672" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.558974 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" event={"ID":"7e3e715b-f024-4842-94ed-1f1e054e89c6","Type":"ContainerStarted","Data":"3e137621be8fd281e91660714989b3524d34fe04d0d9acd500e98e9bc50b05d8"} Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.559013 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" event={"ID":"7e3e715b-f024-4842-94ed-1f1e054e89c6","Type":"ContainerStarted","Data":"92e007ff2e490cd943aeebddedb6f93d838bbb65607bec7ae9a1be207a92d259"} Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.559741 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.560012 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.564997 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.064969612 +0000 UTC m=+232.306848809 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.567002 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-64dsw" event={"ID":"9b2af767-57b1-4774-9668-6610e9ac1bb9","Type":"ContainerStarted","Data":"8af0acb3b09b5977996abac07474f1904f77cdc88726857516e42095b8e57ae7"} Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.583724 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wl9zt"] Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.585240 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-wl9zt"] Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.597368 4792 patch_prober.go:28] interesting pod/route-controller-manager-8ddf4cb6d-chpwp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.597560 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.606623 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:42 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:42 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:42 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.606677 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.609177 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" podStartSLOduration=3.609163379 podStartE2EDuration="3.609163379s" 
podCreationTimestamp="2026-03-01 09:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:42.604353981 +0000 UTC m=+231.846233188" watchObservedRunningTime="2026-03-01 09:11:42.609163379 +0000 UTC m=+231.851042576" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.660984 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.662041 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.162029761 +0000 UTC m=+232.403908958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.669288 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wxb87"] Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.669492 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8578f8dc-143c-423c-b62b-b3190444bafd" containerName="controller-manager" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.669502 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8578f8dc-143c-423c-b62b-b3190444bafd" containerName="controller-manager" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.669623 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8578f8dc-143c-423c-b62b-b3190444bafd" containerName="controller-manager" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.670520 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.674549 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.707822 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wxb87"] Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.764024 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.764462 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.264435262 +0000 UTC m=+232.506314459 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.764556 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-catalog-content\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.764580 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc28k\" (UniqueName: \"kubernetes.io/projected/9073e3da-2d6f-48a3-907a-e347f28559ae-kube-api-access-nc28k\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.764614 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.764674 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-utilities\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.765174 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.26516457 +0000 UTC m=+232.507043767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.829306 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cw675"] Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.852384 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.856708 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.865342 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.865512 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-catalog-content\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.865538 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc28k\" (UniqueName: \"kubernetes.io/projected/9073e3da-2d6f-48a3-907a-e347f28559ae-kube-api-access-nc28k\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.865588 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-utilities\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.866039 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-utilities\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87"
Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.866123 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.366104765 +0000 UTC m=+232.607983972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.866291 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-catalog-content\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87"
Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.870348 4792 ???:1] "http: TLS handshake error from 192.168.126.11:37478: no serving certificate available for the kubelet"
Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.881838 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cw675"]
Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.914274 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc28k\" (UniqueName: \"kubernetes.io/projected/9073e3da-2d6f-48a3-907a-e347f28559ae-kube-api-access-nc28k\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87"
Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.970390 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thnsb\" (UniqueName: \"kubernetes.io/projected/dff0d675-52dd-4cac-a7be-8750333c28e3-kube-api-access-thnsb\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675"
Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.970655 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-catalog-content\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675"
Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.970690 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.970733 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-utilities\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675"
Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.971015 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.471003657 +0000 UTC m=+232.712882854 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.020188 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p6b9w"]
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.020238 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wxb87"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.027457 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6b9w"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.035435 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p6b9w"]
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.073298 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.073545 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-utilities\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.073584 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thnsb\" (UniqueName: \"kubernetes.io/projected/dff0d675-52dd-4cac-a7be-8750333c28e3-kube-api-access-thnsb\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.073643 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-catalog-content\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.074322 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-catalog-content\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675"
Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.074389 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.574375692 +0000 UTC m=+232.816254889 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.078797 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-utilities\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.105976 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thnsb\" (UniqueName: \"kubernetes.io/projected/dff0d675-52dd-4cac-a7be-8750333c28e3-kube-api-access-thnsb\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.135534 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.143427 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.145870 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.151339 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.154974 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.175667 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-utilities\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.175723 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.175748 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-catalog-content\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.175802 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whj6h\" (UniqueName: \"kubernetes.io/projected/6fd91972-6bfc-4041-abc2-8f4298584603-kube-api-access-whj6h\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w"
Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.176069 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.676058706 +0000 UTC m=+232.917937903 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.188629 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-wxl8v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body=
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.188837 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wxl8v" podUID="6cc55bdf-6c0f-4d35-879f-c64c2dc4897c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.189056 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-wxl8v container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body=
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.189111 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wxl8v" podUID="6cc55bdf-6c0f-4d35-879f-c64c2dc4897c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.206812 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-zrzcg"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.206842 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-zrzcg"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.211784 4792 patch_prober.go:28] interesting pod/console-f9d7485db-zrzcg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.211852 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zrzcg" podUID="86788093-42e5-4fa0-9595-97a910e6557e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.226464 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l7ngd"]
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.228164 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cw675"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.242249 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l7ngd"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.250033 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l7ngd"]
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.276399 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.276549 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-catalog-content\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.276606 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.276669 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.276691 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whj6h\" (UniqueName: \"kubernetes.io/projected/6fd91972-6bfc-4041-abc2-8f4298584603-kube-api-access-whj6h\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.276729 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-utilities\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w"
Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.277422 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.777407531 +0000 UTC m=+233.019286718 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.277756 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-catalog-content\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.279155 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-utilities\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w"
Mar 01 09:11:43 crc kubenswrapper[4792]: W0301 09:11:43.329225 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-9bfd625c54787a996c945bf828a16cb9a2d42053182e44eae51137d2a911fd8e WatchSource:0}: Error finding container 9bfd625c54787a996c945bf828a16cb9a2d42053182e44eae51137d2a911fd8e: Status 404 returned error can't find the container with id 9bfd625c54787a996c945bf828a16cb9a2d42053182e44eae51137d2a911fd8e
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.336727 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whj6h\" (UniqueName: \"kubernetes.io/projected/6fd91972-6bfc-4041-abc2-8f4298584603-kube-api-access-whj6h\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.361308 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6b9w"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.378244 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.378346 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.378382 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-utilities\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.378442 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.378500 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn884\" (UniqueName: \"kubernetes.io/projected/8333a325-229b-4dfd-a1f8-966f39bf55fc-kube-api-access-jn884\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.378527 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.378543 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-catalog-content\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd"
Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.378826 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.878810037 +0000 UTC m=+233.120689234 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.404223 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"]
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.404845 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.421403 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.421657 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.421759 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.421918 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.422018 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.422184 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.429438 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.448204 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" path="/var/lib/kubelet/pods/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2/volumes"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.448849 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8578f8dc-143c-423c-b62b-b3190444bafd" path="/var/lib/kubelet/pods/8578f8dc-143c-423c-b62b-b3190444bafd/volumes"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.449299 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"]
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.450554 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.479873 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480093 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-utilities\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480123 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-config\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480161 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fca3ad3-b093-4857-85cb-3db2b6516dcf-serving-cert\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480181 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn884\" (UniqueName: \"kubernetes.io/projected/8333a325-229b-4dfd-a1f8-966f39bf55fc-kube-api-access-jn884\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480206 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-catalog-content\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480230 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-proxy-ca-bundles\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480259 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-client-ca\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480329 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pp7l\" (UniqueName: \"kubernetes.io/projected/2fca3ad3-b093-4857-85cb-3db2b6516dcf-kube-api-access-4pp7l\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.480454 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.980439749 +0000 UTC m=+233.222318946 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480897 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-utilities\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.481832 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-catalog-content\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.508672 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn884\" (UniqueName: \"kubernetes.io/projected/8333a325-229b-4dfd-a1f8-966f39bf55fc-kube-api-access-jn884\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.518234 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.562629 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l7ngd"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.581084 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-proxy-ca-bundles\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.581320 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.581373 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-client-ca\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.581409 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pp7l\" (UniqueName: \"kubernetes.io/projected/2fca3ad3-b093-4857-85cb-3db2b6516dcf-kube-api-access-4pp7l\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.581453 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-config\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.581483 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fca3ad3-b093-4857-85cb-3db2b6516dcf-serving-cert\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.583287 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-client-ca\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.583827 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-config\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.584106 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.084092601 +0000 UTC m=+233.325971798 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.604587 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-proxy-ca-bundles\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.607001 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-qtg4x"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.622372 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pp7l\" (UniqueName: \"kubernetes.io/projected/2fca3ad3-b093-4857-85cb-3db2b6516dcf-kube-api-access-4pp7l\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.631881 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"84214f99543f0b1e0849c6f8c11a2101441281fafaac18eb335087b26b29526d"}
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.632021 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0dbc10d9857ea2b6db4cc8848cb7c7653c840c463dc02da4147a3da7ef6a9a30"}
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.648607 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.651931 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.661654 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fca3ad3-b093-4857-85cb-3db2b6516dcf-serving-cert\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.666201 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 01 09:11:43 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld
Mar 01 09:11:43 crc kubenswrapper[4792]: [+]process-running ok
Mar 01 09:11:43 crc kubenswrapper[4792]: healthz check failed
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.666262 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.683625 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.683865 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.183828616 +0000 UTC m=+233.425707813 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.684287 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.684664 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.688094 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-01 09:11:44.18806828 +0000 UTC m=+233.429947477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.733796 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9bfd625c54787a996c945bf828a16cb9a2d42053182e44eae51137d2a911fd8e"} Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.757047 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"702340f9d880baf765b9e70f1d41d931dc8733c24078f88256d94345792f4b16"} Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.757107 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"61a3e72aa73e0fa3505422f8f0578d5003513bebae5119711655632907b066b5"} Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.757576 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.763377 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.767365 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-64dsw" event={"ID":"9b2af767-57b1-4774-9668-6610e9ac1bb9","Type":"ContainerStarted","Data":"7e2a93d4148b8e054c7f0ca5a48dbf78d0e0ace6e87d813fc0b575cc94d8e3cc"} Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.787473 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wxb87"] Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.788089 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.788864 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.288852542 +0000 UTC m=+233.530731739 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.827591 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.889687 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.892453 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.392425921 +0000 UTC m=+233.634305118 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.991312 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.991704 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.491688615 +0000 UTC m=+233.733567812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.064920 4792 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.079788 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cw675"] Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.093034 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.093383 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.593371508 +0000 UTC m=+233.835250705 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.117491 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.194664 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.195013 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.69499877 +0000 UTC m=+233.936877967 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.236418 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l7ngd"] Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.288386 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p6b9w"] Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.294094 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"] Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.296497 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.296801 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.796788296 +0000 UTC m=+234.038667493 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.397658 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.397979 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.897950987 +0000 UTC m=+234.139830184 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.398226 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.398565 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.898553701 +0000 UTC m=+234.140432898 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.506340 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.506731 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:45.006713964 +0000 UTC m=+234.248593161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.606187 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:44 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:44 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:44 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.606280 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.607420 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.607808 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-01 09:11:45.107758062 +0000 UTC m=+234.349637259 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.708556 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.708700 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:45.208668076 +0000 UTC m=+234.450547273 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.709009 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.709372 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:45.209363993 +0000 UTC m=+234.451243190 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.780218 4792 generic.go:334] "Generic (PLEG): container finished" podID="6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" containerID="db93ce9b530ac529df661a0d9b5aa2418392875dd62f9aecf46dc33ff7dfd43c" exitCode=0 Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.780293 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" event={"ID":"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3","Type":"ContainerDied","Data":"db93ce9b530ac529df661a0d9b5aa2418392875dd62f9aecf46dc33ff7dfd43c"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.786098 4792 generic.go:334] "Generic (PLEG): container finished" podID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerID="2793d9d53a76fa1bceb0317b96453e19780c382a91f9c7fd6f680978a1c2a121" exitCode=0 Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.786138 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cw675" event={"ID":"dff0d675-52dd-4cac-a7be-8750333c28e3","Type":"ContainerDied","Data":"2793d9d53a76fa1bceb0317b96453e19780c382a91f9c7fd6f680978a1c2a121"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.786184 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cw675" event={"ID":"dff0d675-52dd-4cac-a7be-8750333c28e3","Type":"ContainerStarted","Data":"339bd57c1cc13f4119c941d59da1a1ea961d6edda5147620d400c9a04f2e8ceb"} Mar 01 09:11:44 crc kubenswrapper[4792]: 
I0301 09:11:44.788644 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ab5a35f7-19e5-4496-b030-59dfa49a64cf","Type":"ContainerStarted","Data":"671fb79672e1adda67b9686140cc8c8e23feef6fc30c770412920df0821a0f37"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.788674 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ab5a35f7-19e5-4496-b030-59dfa49a64cf","Type":"ContainerStarted","Data":"9b3b8533ecddfd3cef9d38d86a78b540d4a7fe462e6ed508f6f81c47012cffc7"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.793660 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" event={"ID":"2fca3ad3-b093-4857-85cb-3db2b6516dcf","Type":"ContainerStarted","Data":"5e459860652998dbc975fd66e05ff3eab25257081065483e039f02fb979238f7"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.793707 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" event={"ID":"2fca3ad3-b093-4857-85cb-3db2b6516dcf","Type":"ContainerStarted","Data":"834122f1bb558979716c9151687e893e13e6120e36e39817e6beb2432dd0f0fd"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.794022 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.800532 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-64dsw" event={"ID":"9b2af767-57b1-4774-9668-6610e9ac1bb9","Type":"ContainerStarted","Data":"ad6fbd3af39fd89c1a3dcc34e6e3ab5110eacb45dc7bd773cb3231d5c336f69a"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.801557 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.802798 4792 generic.go:334] "Generic (PLEG): container finished" podID="6fd91972-6bfc-4041-abc2-8f4298584603" containerID="aa50e0a49246a84ff0433571a1c37df20be8eda2033b2980a1562df09196f210" exitCode=0 Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.802861 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6b9w" event={"ID":"6fd91972-6bfc-4041-abc2-8f4298584603","Type":"ContainerDied","Data":"aa50e0a49246a84ff0433571a1c37df20be8eda2033b2980a1562df09196f210"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.802884 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6b9w" event={"ID":"6fd91972-6bfc-4041-abc2-8f4298584603","Type":"ContainerStarted","Data":"477ff8111ce50f202ac20e7801f2ac3bd82c5ed546b9e6aeee69367ec0d09908"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.806673 4792 generic.go:334] "Generic (PLEG): container finished" podID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerID="cabcf6681e53e945ec2b29b89c123577ba10471c42575321a6f1da2be37ffe2a" exitCode=0 Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.806734 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxb87" event={"ID":"9073e3da-2d6f-48a3-907a-e347f28559ae","Type":"ContainerDied","Data":"cabcf6681e53e945ec2b29b89c123577ba10471c42575321a6f1da2be37ffe2a"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.806760 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxb87" event={"ID":"9073e3da-2d6f-48a3-907a-e347f28559ae","Type":"ContainerStarted","Data":"f1bc5d15012443ccc9990bce32da439c2d26cd8bac7e62fa7b81593bfe925710"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.808722 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerID="ea0440bd6858d820e5ab5d60d7085504804067b9ef42039aabaf07d6f0cd7730" exitCode=0 Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.808800 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7ngd" event={"ID":"8333a325-229b-4dfd-a1f8-966f39bf55fc","Type":"ContainerDied","Data":"ea0440bd6858d820e5ab5d60d7085504804067b9ef42039aabaf07d6f0cd7730"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.808835 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7ngd" event={"ID":"8333a325-229b-4dfd-a1f8-966f39bf55fc","Type":"ContainerStarted","Data":"90fafa3ed5c527a4e521d0e12598d211c5be0e011b432479d9a243fe086b9d81"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.809450 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.809562 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:45.309537229 +0000 UTC m=+234.551416426 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.809693 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.810010 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:45.30999858 +0000 UTC m=+234.551877777 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.816842 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4e25f294f98231b81432bed1d2a4d2d921bf0ecad4575f5829c0b9e753030adb"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.823080 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.823917 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.824491 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n28r8"] Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.827100 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.827213 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.833375 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.839119 4792 patch_prober.go:28] interesting pod/apiserver-76f77b778f-6lk5b container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]log ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]etcd ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/generic-apiserver-start-informers ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/max-in-flight-filter ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 01 09:11:44 crc kubenswrapper[4792]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 01 09:11:44 crc kubenswrapper[4792]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/project.openshift.io-projectcache ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/openshift.io-startinformers ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 01 09:11:44 crc kubenswrapper[4792]: livez check failed Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 
09:11:44.839182 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" podUID="499393fc-abcf-4998-9e32-3d43a0b1e488" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.847358 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-64dsw" podStartSLOduration=13.84733928 podStartE2EDuration="13.84733928s" podCreationTimestamp="2026-03-01 09:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:44.844842348 +0000 UTC m=+234.086721555" watchObservedRunningTime="2026-03-01 09:11:44.84733928 +0000 UTC m=+234.089218477" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.852454 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n28r8"] Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.860341 4792 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-01T09:11:44.064951459Z","Handler":null,"Name":""} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.869671 4792 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.869703 4792 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.901928 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
podStartSLOduration=1.9018922630000001 podStartE2EDuration="1.901892263s" podCreationTimestamp="2026-03-01 09:11:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:44.861461837 +0000 UTC m=+234.103341034" watchObservedRunningTime="2026-03-01 09:11:44.901892263 +0000 UTC m=+234.143771470" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.910435 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.911331 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc5gl\" (UniqueName: \"kubernetes.io/projected/e03dee1d-d7ca-422c-8af6-faa0c1af3863-kube-api-access-kc5gl\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.911695 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-catalog-content\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.911899 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-utilities\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " 
pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.961335 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.964050 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" podStartSLOduration=5.964032602 podStartE2EDuration="5.964032602s" podCreationTimestamp="2026-03-01 09:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:44.96230743 +0000 UTC m=+234.204186627" watchObservedRunningTime="2026-03-01 09:11:44.964032602 +0000 UTC m=+234.205911799" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.979855 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.013952 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc5gl\" (UniqueName: \"kubernetes.io/projected/e03dee1d-d7ca-422c-8af6-faa0c1af3863-kube-api-access-kc5gl\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.014020 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.014067 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-catalog-content\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.014116 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-utilities\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.014559 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-utilities\") pod \"redhat-marketplace-n28r8\" (UID: 
\"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.015692 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-catalog-content\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.048268 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc5gl\" (UniqueName: \"kubernetes.io/projected/e03dee1d-d7ca-422c-8af6-faa0c1af3863-kube-api-access-kc5gl\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.146120 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.218892 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zwt8t"] Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.219871 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.224436 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.224532 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.251397 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwt8t"] Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.320465 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjlvw\" (UniqueName: \"kubernetes.io/projected/fee8fc8f-8d72-4606-b115-4197f599cfcb-kube-api-access-mjlvw\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.320546 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-utilities\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.320571 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-catalog-content\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " 
pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.410541 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.422552 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjlvw\" (UniqueName: \"kubernetes.io/projected/fee8fc8f-8d72-4606-b115-4197f599cfcb-kube-api-access-mjlvw\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.422624 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-utilities\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.422647 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-catalog-content\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.423107 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-catalog-content\") pod \"redhat-marketplace-zwt8t\" (UID: 
\"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.423322 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-utilities\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.454539 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.463754 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjlvw\" (UniqueName: \"kubernetes.io/projected/fee8fc8f-8d72-4606-b115-4197f599cfcb-kube-api-access-mjlvw\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.542072 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.553957 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.607299 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:45 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:45 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:45 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.607352 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.654928 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n28r8"] Mar 01 09:11:45 crc kubenswrapper[4792]: W0301 09:11:45.683043 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode03dee1d_d7ca_422c_8af6_faa0c1af3863.slice/crio-385e5f3c0c4fe34a9ca54a291cae71ccbb846b542d87341115b79874eadd771a WatchSource:0}: Error finding container 385e5f3c0c4fe34a9ca54a291cae71ccbb846b542d87341115b79874eadd771a: Status 404 returned error can't find the container with id 385e5f3c0c4fe34a9ca54a291cae71ccbb846b542d87341115b79874eadd771a Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.837952 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7nwc2"] Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.839163 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.848217 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.855606 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n28r8" event={"ID":"e03dee1d-d7ca-422c-8af6-faa0c1af3863","Type":"ContainerStarted","Data":"385e5f3c0c4fe34a9ca54a291cae71ccbb846b542d87341115b79874eadd771a"} Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.859150 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7nwc2"] Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.866792 4792 generic.go:334] "Generic (PLEG): container finished" podID="ab5a35f7-19e5-4496-b030-59dfa49a64cf" containerID="671fb79672e1adda67b9686140cc8c8e23feef6fc30c770412920df0821a0f37" exitCode=0 Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.866929 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ab5a35f7-19e5-4496-b030-59dfa49a64cf","Type":"ContainerDied","Data":"671fb79672e1adda67b9686140cc8c8e23feef6fc30c770412920df0821a0f37"} Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.933723 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-utilities\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.933805 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djdkr\" (UniqueName: 
\"kubernetes.io/projected/22c7c368-3523-4224-aebd-59b29640bed0-kube-api-access-djdkr\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.934103 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-catalog-content\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.980536 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwt8t"] Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.037471 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-utilities\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.037502 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djdkr\" (UniqueName: \"kubernetes.io/projected/22c7c368-3523-4224-aebd-59b29640bed0-kube-api-access-djdkr\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.037557 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-catalog-content\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:46 crc 
kubenswrapper[4792]: I0301 09:11:46.037927 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-catalog-content\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.038135 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-utilities\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.092155 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djdkr\" (UniqueName: \"kubernetes.io/projected/22c7c368-3523-4224-aebd-59b29640bed0-kube-api-access-djdkr\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.110798 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4jwnr"] Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.113475 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.114270 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.123897 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.124346 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 01 09:11:46 crc kubenswrapper[4792]: W0301 09:11:46.136952 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf147eb3a_0f65_4ecb_b1a2_5d561c21253c.slice/crio-21cb9ec22cbd36ee1804b1e2f96533e30f8509709de84a1859f63834b8df4b3f WatchSource:0}: Error finding container 21cb9ec22cbd36ee1804b1e2f96533e30f8509709de84a1859f63834b8df4b3f: Status 404 returned error can't find the container with id 21cb9ec22cbd36ee1804b1e2f96533e30f8509709de84a1859f63834b8df4b3f Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.188312 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.190767 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.242954 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.243229 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.275123 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ftk4v"] Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.276281 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.322501 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ftk4v"] Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.344602 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-utilities\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.344667 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.344701 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tbnw\" (UniqueName: \"kubernetes.io/projected/e93e87c0-86c5-446d-9f43-71d17960a351-kube-api-access-4tbnw\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.344721 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-catalog-content\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.344741 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.344834 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.403645 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.447531 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-utilities\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.447603 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tbnw\" (UniqueName: \"kubernetes.io/projected/e93e87c0-86c5-446d-9f43-71d17960a351-kube-api-access-4tbnw\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.447626 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-catalog-content\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.448068 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-catalog-content\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.448298 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-utilities\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.463058 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.500884 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.536515 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tbnw\" (UniqueName: \"kubernetes.io/projected/e93e87c0-86c5-446d-9f43-71d17960a351-kube-api-access-4tbnw\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.618162 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:46 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:46 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:46 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.618209 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.622242 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.650288 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-config-volume\") pod \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.650539 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-secret-volume\") pod \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.650675 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzw4q\" (UniqueName: \"kubernetes.io/projected/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-kube-api-access-lzw4q\") pod \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.651733 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-config-volume" (OuterVolumeSpecName: "config-volume") pod "6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" (UID: "6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.674377 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-kube-api-access-lzw4q" (OuterVolumeSpecName: "kube-api-access-lzw4q") pod "6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" (UID: "6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3"). InnerVolumeSpecName "kube-api-access-lzw4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.677763 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" (UID: "6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.752548 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzw4q\" (UniqueName: \"kubernetes.io/projected/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-kube-api-access-lzw4q\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.752583 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-config-volume\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.752592 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.889784 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" event={"ID":"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3","Type":"ContainerDied","Data":"e332fd58bac36920eb992b11914f683b07598dd8216c452a72973706723d6019"} Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.889819 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e332fd58bac36920eb992b11914f683b07598dd8216c452a72973706723d6019" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.889890 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.936145 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7nwc2"] Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.961795 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" event={"ID":"f147eb3a-0f65-4ecb-b1a2-5d561c21253c","Type":"ContainerStarted","Data":"0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef"} Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.962178 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" event={"ID":"f147eb3a-0f65-4ecb-b1a2-5d561c21253c","Type":"ContainerStarted","Data":"21cb9ec22cbd36ee1804b1e2f96533e30f8509709de84a1859f63834b8df4b3f"} Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.962215 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.995837 4792 generic.go:334] "Generic (PLEG): container finished" podID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerID="e49f509c95cf50378f216231453383d3c1c559a790286ac1d1a0c0a75d1546f4" exitCode=0 Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.996049 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwt8t" event={"ID":"fee8fc8f-8d72-4606-b115-4197f599cfcb","Type":"ContainerDied","Data":"e49f509c95cf50378f216231453383d3c1c559a790286ac1d1a0c0a75d1546f4"} Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.996081 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwt8t" 
event={"ID":"fee8fc8f-8d72-4606-b115-4197f599cfcb","Type":"ContainerStarted","Data":"6a4a6d1ca04b5a791e8cc232ac1bcb86844593ceb569a67c479055eb35e8caec"} Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.000797 4792 generic.go:334] "Generic (PLEG): container finished" podID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerID="0c84982bc61501954eca9c9293432f6ed8f745c671471318271733558785ba39" exitCode=0 Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.002445 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n28r8" event={"ID":"e03dee1d-d7ca-422c-8af6-faa0c1af3863","Type":"ContainerDied","Data":"0c84982bc61501954eca9c9293432f6ed8f745c671471318271733558785ba39"} Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.018609 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" podStartSLOduration=175.018591181 podStartE2EDuration="2m55.018591181s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:47.017280019 +0000 UTC m=+236.259159246" watchObservedRunningTime="2026-03-01 09:11:47.018591181 +0000 UTC m=+236.260470378" Mar 01 09:11:47 crc kubenswrapper[4792]: W0301 09:11:47.034642 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22c7c368_3523_4224_aebd_59b29640bed0.slice/crio-52cac0b41f9f5231fc4b99f244ac1d2bd2e66edf0a6640e4bac8c95c340ce560 WatchSource:0}: Error finding container 52cac0b41f9f5231fc4b99f244ac1d2bd2e66edf0a6640e4bac8c95c340ce560: Status 404 returned error can't find the container with id 52cac0b41f9f5231fc4b99f244ac1d2bd2e66edf0a6640e4bac8c95c340ce560 Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.141246 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.468640 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ftk4v"] Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.615715 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:47 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:47 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:47 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.616027 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.724366 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.779679 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kube-api-access\") pod \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.779735 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kubelet-dir\") pod \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.780125 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ab5a35f7-19e5-4496-b030-59dfa49a64cf" (UID: "ab5a35f7-19e5-4496-b030-59dfa49a64cf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.788165 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ab5a35f7-19e5-4496-b030-59dfa49a64cf" (UID: "ab5a35f7-19e5-4496-b030-59dfa49a64cf"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.882311 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.882375 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.020183 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9a193ef8-bf31-4d61-972e-5772b8fe8c39","Type":"ContainerStarted","Data":"8c3c0f18faf3dfd4f659a530a5e1444bf2456ab64ec87d38ae446eb8262bf116"} Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.026844 4792 ???:1] "http: TLS handshake error from 192.168.126.11:60374: no serving certificate available for the kubelet" Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.033433 4792 generic.go:334] "Generic (PLEG): container finished" podID="22c7c368-3523-4224-aebd-59b29640bed0" containerID="821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46" exitCode=0 Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.033615 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7nwc2" event={"ID":"22c7c368-3523-4224-aebd-59b29640bed0","Type":"ContainerDied","Data":"821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46"} Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.033762 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7nwc2" event={"ID":"22c7c368-3523-4224-aebd-59b29640bed0","Type":"ContainerStarted","Data":"52cac0b41f9f5231fc4b99f244ac1d2bd2e66edf0a6640e4bac8c95c340ce560"} Mar 
01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.044851 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ab5a35f7-19e5-4496-b030-59dfa49a64cf","Type":"ContainerDied","Data":"9b3b8533ecddfd3cef9d38d86a78b540d4a7fe462e6ed508f6f81c47012cffc7"} Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.044923 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b3b8533ecddfd3cef9d38d86a78b540d4a7fe462e6ed508f6f81c47012cffc7" Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.044935 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.067673 4792 generic.go:334] "Generic (PLEG): container finished" podID="e93e87c0-86c5-446d-9f43-71d17960a351" containerID="c36d8345eeaa99ca3b40ff37727a6e09aab9d36c26bc0292d9c0f3405bd1fe17" exitCode=0 Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.068839 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftk4v" event={"ID":"e93e87c0-86c5-446d-9f43-71d17960a351","Type":"ContainerDied","Data":"c36d8345eeaa99ca3b40ff37727a6e09aab9d36c26bc0292d9c0f3405bd1fe17"} Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.068863 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftk4v" event={"ID":"e93e87c0-86c5-446d-9f43-71d17960a351","Type":"ContainerStarted","Data":"4d930733fc56c620ebbeeb1e0668d704b5df033f141d5439fd47f1e4757bacb0"} Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.606411 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:48 crc kubenswrapper[4792]: [-]has-synced 
failed: reason withheld Mar 01 09:11:48 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:48 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.606469 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:49 crc kubenswrapper[4792]: I0301 09:11:49.117615 4792 generic.go:334] "Generic (PLEG): container finished" podID="9a193ef8-bf31-4d61-972e-5772b8fe8c39" containerID="c4d130b61a49d2b6adb269d65f6c6f670dd6878b17ef19970b09e3cdeea1a62b" exitCode=0 Mar 01 09:11:49 crc kubenswrapper[4792]: I0301 09:11:49.117694 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9a193ef8-bf31-4d61-972e-5772b8fe8c39","Type":"ContainerDied","Data":"c4d130b61a49d2b6adb269d65f6c6f670dd6878b17ef19970b09e3cdeea1a62b"} Mar 01 09:11:49 crc kubenswrapper[4792]: I0301 09:11:49.607759 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:49 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:49 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:49 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:49 crc kubenswrapper[4792]: I0301 09:11:49.607831 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:49 crc kubenswrapper[4792]: I0301 09:11:49.829087 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:49 crc kubenswrapper[4792]: I0301 09:11:49.833801 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:49 crc kubenswrapper[4792]: I0301 09:11:49.989194 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.606395 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:50 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:50 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:50 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.606461 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.667658 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.740659 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kube-api-access\") pod \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.740704 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kubelet-dir\") pod \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.740982 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9a193ef8-bf31-4d61-972e-5772b8fe8c39" (UID: "9a193ef8-bf31-4d61-972e-5772b8fe8c39"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.757120 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9a193ef8-bf31-4d61-972e-5772b8fe8c39" (UID: "9a193ef8-bf31-4d61-972e-5772b8fe8c39"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.843206 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.843251 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:51 crc kubenswrapper[4792]: I0301 09:11:51.212091 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9a193ef8-bf31-4d61-972e-5772b8fe8c39","Type":"ContainerDied","Data":"8c3c0f18faf3dfd4f659a530a5e1444bf2456ab64ec87d38ae446eb8262bf116"} Mar 01 09:11:51 crc kubenswrapper[4792]: I0301 09:11:51.212418 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c3c0f18faf3dfd4f659a530a5e1444bf2456ab64ec87d38ae446eb8262bf116" Mar 01 09:11:51 crc kubenswrapper[4792]: I0301 09:11:51.212170 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:51 crc kubenswrapper[4792]: I0301 09:11:51.605582 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:51 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:51 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:51 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:51 crc kubenswrapper[4792]: I0301 09:11:51.605635 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:52 crc kubenswrapper[4792]: I0301 09:11:52.546622 4792 ???:1] "http: TLS handshake error from 192.168.126.11:60390: no serving certificate available for the kubelet" Mar 01 09:11:52 crc kubenswrapper[4792]: I0301 09:11:52.607790 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:52 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:52 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:52 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:52 crc kubenswrapper[4792]: I0301 09:11:52.607849 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.187171 4792 
patch_prober.go:28] interesting pod/downloads-7954f5f757-wxl8v container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.187462 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wxl8v" podUID="6cc55bdf-6c0f-4d35-879f-c64c2dc4897c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.187182 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-wxl8v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.187510 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wxl8v" podUID="6cc55bdf-6c0f-4d35-879f-c64c2dc4897c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.205928 4792 patch_prober.go:28] interesting pod/console-f9d7485db-zrzcg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.205970 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zrzcg" podUID="86788093-42e5-4fa0-9595-97a910e6557e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection 
refused" Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.606173 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:53 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:53 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:53 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.606229 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:54 crc kubenswrapper[4792]: I0301 09:11:54.628361 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:54 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:54 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:54 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:54 crc kubenswrapper[4792]: I0301 09:11:54.628476 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:55 crc kubenswrapper[4792]: I0301 09:11:55.605296 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:55 
crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:55 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:55 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:55 crc kubenswrapper[4792]: I0301 09:11:55.605675 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:56 crc kubenswrapper[4792]: I0301 09:11:56.606064 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:56 crc kubenswrapper[4792]: [+]has-synced ok Mar 01 09:11:56 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:56 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:56 crc kubenswrapper[4792]: I0301 09:11:56.606117 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:56 crc kubenswrapper[4792]: I0301 09:11:56.727671 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:56 crc kubenswrapper[4792]: I0301 09:11:56.729235 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 01 09:11:56 crc kubenswrapper[4792]: I0301 09:11:56.751875 4792 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:56 crc kubenswrapper[4792]: I0301 09:11:56.852630 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 01 09:11:56 crc kubenswrapper[4792]: I0301 09:11:56.861803 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:57 crc kubenswrapper[4792]: I0301 09:11:57.605777 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:57 crc kubenswrapper[4792]: I0301 09:11:57.615263 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:57 crc kubenswrapper[4792]: I0301 09:11:57.840747 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"] Mar 01 09:11:57 crc kubenswrapper[4792]: I0301 09:11:57.841024 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerName="controller-manager" containerID="cri-o://5e459860652998dbc975fd66e05ff3eab25257081065483e039f02fb979238f7" gracePeriod=30 Mar 01 09:11:57 crc kubenswrapper[4792]: I0301 09:11:57.860784 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"] Mar 01 09:11:57 crc kubenswrapper[4792]: I0301 09:11:57.861097 4792 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" containerID="cri-o://3e137621be8fd281e91660714989b3524d34fe04d0d9acd500e98e9bc50b05d8" gracePeriod=30 Mar 01 09:11:58 crc kubenswrapper[4792]: I0301 09:11:58.302806 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50104: no serving certificate available for the kubelet" Mar 01 09:11:59 crc kubenswrapper[4792]: I0301 09:11:59.302675 4792 generic.go:334] "Generic (PLEG): container finished" podID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerID="3e137621be8fd281e91660714989b3524d34fe04d0d9acd500e98e9bc50b05d8" exitCode=0 Mar 01 09:11:59 crc kubenswrapper[4792]: I0301 09:11:59.302710 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" event={"ID":"7e3e715b-f024-4842-94ed-1f1e054e89c6","Type":"ContainerDied","Data":"3e137621be8fd281e91660714989b3524d34fe04d0d9acd500e98e9bc50b05d8"} Mar 01 09:11:59 crc kubenswrapper[4792]: I0301 09:11:59.305119 4792 generic.go:334] "Generic (PLEG): container finished" podID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerID="5e459860652998dbc975fd66e05ff3eab25257081065483e039f02fb979238f7" exitCode=0 Mar 01 09:11:59 crc kubenswrapper[4792]: I0301 09:11:59.305152 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" event={"ID":"2fca3ad3-b093-4857-85cb-3db2b6516dcf","Type":"ContainerDied","Data":"5e459860652998dbc975fd66e05ff3eab25257081065483e039f02fb979238f7"} Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.132121 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539272-nq8dk"] Mar 01 09:12:00 crc kubenswrapper[4792]: E0301 09:12:00.132794 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" 
containerName="collect-profiles" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.132814 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" containerName="collect-profiles" Mar 01 09:12:00 crc kubenswrapper[4792]: E0301 09:12:00.132834 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a193ef8-bf31-4d61-972e-5772b8fe8c39" containerName="pruner" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.132839 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a193ef8-bf31-4d61-972e-5772b8fe8c39" containerName="pruner" Mar 01 09:12:00 crc kubenswrapper[4792]: E0301 09:12:00.132851 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab5a35f7-19e5-4496-b030-59dfa49a64cf" containerName="pruner" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.132858 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab5a35f7-19e5-4496-b030-59dfa49a64cf" containerName="pruner" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.132988 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a193ef8-bf31-4d61-972e-5772b8fe8c39" containerName="pruner" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.133002 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab5a35f7-19e5-4496-b030-59dfa49a64cf" containerName="pruner" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.133011 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" containerName="collect-profiles" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.133382 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539272-nq8dk" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.136510 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.138730 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539272-nq8dk"] Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.305231 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjz8h\" (UniqueName: \"kubernetes.io/projected/8d574e82-f840-4f0c-982d-f6a133bd64ae-kube-api-access-hjz8h\") pod \"auto-csr-approver-29539272-nq8dk\" (UID: \"8d574e82-f840-4f0c-982d-f6a133bd64ae\") " pod="openshift-infra/auto-csr-approver-29539272-nq8dk" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.411895 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjz8h\" (UniqueName: \"kubernetes.io/projected/8d574e82-f840-4f0c-982d-f6a133bd64ae-kube-api-access-hjz8h\") pod \"auto-csr-approver-29539272-nq8dk\" (UID: \"8d574e82-f840-4f0c-982d-f6a133bd64ae\") " pod="openshift-infra/auto-csr-approver-29539272-nq8dk" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.450470 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjz8h\" (UniqueName: \"kubernetes.io/projected/8d574e82-f840-4f0c-982d-f6a133bd64ae-kube-api-access-hjz8h\") pod \"auto-csr-approver-29539272-nq8dk\" (UID: \"8d574e82-f840-4f0c-982d-f6a133bd64ae\") " pod="openshift-infra/auto-csr-approver-29539272-nq8dk" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.455010 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539272-nq8dk" Mar 01 09:12:01 crc kubenswrapper[4792]: I0301 09:12:01.204575 4792 patch_prober.go:28] interesting pod/route-controller-manager-8ddf4cb6d-chpwp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 01 09:12:01 crc kubenswrapper[4792]: I0301 09:12:01.204630 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 01 09:12:03 crc kubenswrapper[4792]: I0301 09:12:03.192018 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-wxl8v" Mar 01 09:12:03 crc kubenswrapper[4792]: I0301 09:12:03.214653 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:12:03 crc kubenswrapper[4792]: I0301 09:12:03.222302 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:12:03 crc kubenswrapper[4792]: I0301 09:12:03.764759 4792 patch_prober.go:28] interesting pod/controller-manager-5bf56554b8-qv8hr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 01 09:12:03 crc kubenswrapper[4792]: I0301 09:12:03.764821 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" 
podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 01 09:12:04 crc kubenswrapper[4792]: I0301 09:12:04.943602 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:12:04 crc kubenswrapper[4792]: I0301 09:12:04.943654 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:12:05 crc kubenswrapper[4792]: I0301 09:12:05.561859 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:12:11 crc kubenswrapper[4792]: I0301 09:12:11.204135 4792 patch_prober.go:28] interesting pod/route-controller-manager-8ddf4cb6d-chpwp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 01 09:12:11 crc kubenswrapper[4792]: I0301 09:12:11.204183 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 01 09:12:13 crc kubenswrapper[4792]: I0301 
09:12:13.764445 4792 patch_prober.go:28] interesting pod/controller-manager-5bf56554b8-qv8hr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 01 09:12:13 crc kubenswrapper[4792]: I0301 09:12:13.764835 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 01 09:12:14 crc kubenswrapper[4792]: I0301 09:12:14.903943 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.694933 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.695848 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.697429 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.702640 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.710091 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.763714 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15c3e52b-97f9-45e8-a7ba-c360739547e7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.763802 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15c3e52b-97f9-45e8-a7ba-c360739547e7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.864501 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15c3e52b-97f9-45e8-a7ba-c360739547e7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.864567 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/15c3e52b-97f9-45e8-a7ba-c360739547e7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.864675 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15c3e52b-97f9-45e8-a7ba-c360739547e7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.887735 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15c3e52b-97f9-45e8-a7ba-c360739547e7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:20 crc kubenswrapper[4792]: I0301 09:12:20.071230 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:21 crc kubenswrapper[4792]: I0301 09:12:21.203838 4792 patch_prober.go:28] interesting pod/route-controller-manager-8ddf4cb6d-chpwp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 01 09:12:21 crc kubenswrapper[4792]: I0301 09:12:21.204228 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 01 09:12:21 crc kubenswrapper[4792]: I0301 09:12:21.882994 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.495069 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.495998 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.507557 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.623299 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kube-api-access\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.623700 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-var-lock\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.623845 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.725052 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kube-api-access\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.725837 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-var-lock\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.726171 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.726067 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-var-lock\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.726345 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.744328 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kube-api-access\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.764665 4792 patch_prober.go:28] interesting pod/controller-manager-5bf56554b8-qv8hr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.766076 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.847207 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:26 crc kubenswrapper[4792]: I0301 09:12:26.981511 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.017119 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn"] Mar 01 09:12:27 crc kubenswrapper[4792]: E0301 09:12:27.017413 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.018055 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.018479 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.022629 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.026826 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.029073 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn"] Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.060049 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3e715b-f024-4842-94ed-1f1e054e89c6-serving-cert\") pod \"7e3e715b-f024-4842-94ed-1f1e054e89c6\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.060093 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-client-ca\") pod \"7e3e715b-f024-4842-94ed-1f1e054e89c6\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.060138 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwpwk\" (UniqueName: \"kubernetes.io/projected/7e3e715b-f024-4842-94ed-1f1e054e89c6-kube-api-access-pwpwk\") pod \"7e3e715b-f024-4842-94ed-1f1e054e89c6\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.060258 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-config\") pod \"7e3e715b-f024-4842-94ed-1f1e054e89c6\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.061384 4792 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-config" (OuterVolumeSpecName: "config") pod "7e3e715b-f024-4842-94ed-1f1e054e89c6" (UID: "7e3e715b-f024-4842-94ed-1f1e054e89c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.062322 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-client-ca" (OuterVolumeSpecName: "client-ca") pod "7e3e715b-f024-4842-94ed-1f1e054e89c6" (UID: "7e3e715b-f024-4842-94ed-1f1e054e89c6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.069737 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e3e715b-f024-4842-94ed-1f1e054e89c6-kube-api-access-pwpwk" (OuterVolumeSpecName: "kube-api-access-pwpwk") pod "7e3e715b-f024-4842-94ed-1f1e054e89c6" (UID: "7e3e715b-f024-4842-94ed-1f1e054e89c6"). InnerVolumeSpecName "kube-api-access-pwpwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.094613 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3e715b-f024-4842-94ed-1f1e054e89c6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7e3e715b-f024-4842-94ed-1f1e054e89c6" (UID: "7e3e715b-f024-4842-94ed-1f1e054e89c6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.161647 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-config\") pod \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.161735 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fca3ad3-b093-4857-85cb-3db2b6516dcf-serving-cert\") pod \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.161788 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pp7l\" (UniqueName: \"kubernetes.io/projected/2fca3ad3-b093-4857-85cb-3db2b6516dcf-kube-api-access-4pp7l\") pod \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.161820 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-client-ca\") pod \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.161842 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-proxy-ca-bundles\") pod \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162093 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz4dk\" 
(UniqueName: \"kubernetes.io/projected/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-kube-api-access-pz4dk\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162139 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-serving-cert\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162184 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-client-ca\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162247 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-config\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162290 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3e715b-f024-4842-94ed-1f1e054e89c6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162304 4792 reconciler_common.go:293] "Volume detached 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162321 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwpwk\" (UniqueName: \"kubernetes.io/projected/7e3e715b-f024-4842-94ed-1f1e054e89c6-kube-api-access-pwpwk\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162363 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162643 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-client-ca" (OuterVolumeSpecName: "client-ca") pod "2fca3ad3-b093-4857-85cb-3db2b6516dcf" (UID: "2fca3ad3-b093-4857-85cb-3db2b6516dcf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162733 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2fca3ad3-b093-4857-85cb-3db2b6516dcf" (UID: "2fca3ad3-b093-4857-85cb-3db2b6516dcf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162896 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-config" (OuterVolumeSpecName: "config") pod "2fca3ad3-b093-4857-85cb-3db2b6516dcf" (UID: "2fca3ad3-b093-4857-85cb-3db2b6516dcf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.165064 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fca3ad3-b093-4857-85cb-3db2b6516dcf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2fca3ad3-b093-4857-85cb-3db2b6516dcf" (UID: "2fca3ad3-b093-4857-85cb-3db2b6516dcf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.165134 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fca3ad3-b093-4857-85cb-3db2b6516dcf-kube-api-access-4pp7l" (OuterVolumeSpecName: "kube-api-access-4pp7l") pod "2fca3ad3-b093-4857-85cb-3db2b6516dcf" (UID: "2fca3ad3-b093-4857-85cb-3db2b6516dcf"). InnerVolumeSpecName "kube-api-access-4pp7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263589 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-config\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263642 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz4dk\" (UniqueName: \"kubernetes.io/projected/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-kube-api-access-pz4dk\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263676 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-serving-cert\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263706 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-client-ca\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263761 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pp7l\" (UniqueName: \"kubernetes.io/projected/2fca3ad3-b093-4857-85cb-3db2b6516dcf-kube-api-access-4pp7l\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263773 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263782 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263793 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263801 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fca3ad3-b093-4857-85cb-3db2b6516dcf-serving-cert\") on 
node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.264791 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-client-ca\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.265078 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-config\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.266882 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-serving-cert\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.279311 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz4dk\" (UniqueName: \"kubernetes.io/projected/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-kube-api-access-pz4dk\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.341119 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.469595 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" event={"ID":"7e3e715b-f024-4842-94ed-1f1e054e89c6","Type":"ContainerDied","Data":"92e007ff2e490cd943aeebddedb6f93d838bbb65607bec7ae9a1be207a92d259"} Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.469629 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.469650 4792 scope.go:117] "RemoveContainer" containerID="3e137621be8fd281e91660714989b3524d34fe04d0d9acd500e98e9bc50b05d8" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.473550 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" event={"ID":"2fca3ad3-b093-4857-85cb-3db2b6516dcf","Type":"ContainerDied","Data":"834122f1bb558979716c9151687e893e13e6120e36e39817e6beb2432dd0f0fd"} Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.473608 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.490870 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"] Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.495095 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"] Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.498940 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"] Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.502040 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"] Mar 01 09:12:28 crc kubenswrapper[4792]: E0301 09:12:28.172172 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 01 09:12:28 crc kubenswrapper[4792]: E0301 09:12:28.172339 4792 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 01 09:12:28 crc kubenswrapper[4792]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 01 09:12:28 crc kubenswrapper[4792]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ql5bq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29539270-q7hck_openshift-infra(b4130507-2de2-48c2-9c3f-e9474aeca556): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 01 09:12:28 crc kubenswrapper[4792]: > logger="UnhandledError" Mar 01 09:12:28 crc kubenswrapper[4792]: E0301 09:12:28.173866 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29539270-q7hck" podUID="b4130507-2de2-48c2-9c3f-e9474aeca556" Mar 01 09:12:28 crc kubenswrapper[4792]: E0301 09:12:28.481353 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29539270-q7hck" podUID="b4130507-2de2-48c2-9c3f-e9474aeca556" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.416028 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" 
path="/var/lib/kubelet/pods/2fca3ad3-b093-4857-85cb-3db2b6516dcf/volumes" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.416991 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" path="/var/lib/kubelet/pods/7e3e715b-f024-4842-94ed-1f1e054e89c6/volumes" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.487043 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-848d8759d6-kmmxk"] Mar 01 09:12:29 crc kubenswrapper[4792]: E0301 09:12:29.487279 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerName="controller-manager" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.487295 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerName="controller-manager" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.487393 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerName="controller-manager" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.487808 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.490003 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.490161 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.491282 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.491514 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.492177 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.492387 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-848d8759d6-kmmxk"] Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.492480 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.499475 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.591701 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l558v\" (UniqueName: \"kubernetes.io/projected/1761bae5-8e03-478f-938c-df41041a062c-kube-api-access-l558v\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " 
pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.591759 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-client-ca\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.591876 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-config\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.592087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-proxy-ca-bundles\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.592186 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1761bae5-8e03-478f-938c-df41041a062c-serving-cert\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.693869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-config\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.693973 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-proxy-ca-bundles\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.694034 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1761bae5-8e03-478f-938c-df41041a062c-serving-cert\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.694107 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l558v\" (UniqueName: \"kubernetes.io/projected/1761bae5-8e03-478f-938c-df41041a062c-kube-api-access-l558v\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.694146 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-client-ca\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.694946 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-client-ca\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.695368 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-config\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.695741 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-proxy-ca-bundles\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.699631 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1761bae5-8e03-478f-938c-df41041a062c-serving-cert\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.720452 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l558v\" (UniqueName: \"kubernetes.io/projected/1761bae5-8e03-478f-938c-df41041a062c-kube-api-access-l558v\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 
09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.811153 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: E0301 09:12:29.953301 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 01 09:12:30 crc kubenswrapper[4792]: E0301 09:12:29.954159 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mjlvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]Env
FromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-zwt8t_openshift-marketplace(fee8fc8f-8d72-4606-b115-4197f599cfcb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 01 09:12:30 crc kubenswrapper[4792]: E0301 09:12:29.955396 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-zwt8t" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" Mar 01 09:12:30 crc kubenswrapper[4792]: E0301 09:12:30.133604 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 01 09:12:30 crc kubenswrapper[4792]: E0301 09:12:30.133760 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kc5gl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-n28r8_openshift-marketplace(e03dee1d-d7ca-422c-8af6-faa0c1af3863): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 01 09:12:30 crc kubenswrapper[4792]: E0301 09:12:30.134928 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-n28r8" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" Mar 01 09:12:33 crc 
kubenswrapper[4792]: E0301 09:12:33.628378 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-zwt8t" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" Mar 01 09:12:33 crc kubenswrapper[4792]: E0301 09:12:33.628382 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-n28r8" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" Mar 01 09:12:33 crc kubenswrapper[4792]: E0301 09:12:33.704230 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 01 09:12:33 crc kubenswrapper[4792]: E0301 09:12:33.704603 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4tbnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-ftk4v_openshift-marketplace(e93e87c0-86c5-446d-9f43-71d17960a351): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 01 09:12:33 crc kubenswrapper[4792]: E0301 09:12:33.705858 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-ftk4v" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" Mar 01 09:12:34 crc 
kubenswrapper[4792]: I0301 09:12:34.942894 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:12:34 crc kubenswrapper[4792]: I0301 09:12:34.942968 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:12:36 crc kubenswrapper[4792]: E0301 09:12:36.494144 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-ftk4v" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" Mar 01 09:12:37 crc kubenswrapper[4792]: E0301 09:12:37.706811 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 01 09:12:37 crc kubenswrapper[4792]: E0301 09:12:37.706968 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn884,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-l7ngd_openshift-marketplace(8333a325-229b-4dfd-a1f8-966f39bf55fc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 01 09:12:37 crc kubenswrapper[4792]: E0301 09:12:37.708267 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-l7ngd" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" Mar 01 09:12:39 crc 
kubenswrapper[4792]: I0301 09:12:39.291172 4792 ???:1] "http: TLS handshake error from 192.168.126.11:47816: no serving certificate available for the kubelet"
Mar 01 09:12:40 crc kubenswrapper[4792]: E0301 09:12:40.009697 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-l7ngd" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc"
Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.047886 4792 scope.go:117] "RemoveContainer" containerID="5e459860652998dbc975fd66e05ff3eab25257081065483e039f02fb979238f7"
Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.397018 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn"]
Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.503538 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539272-nq8dk"]
Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.523989 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-frm7z"]
Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.541107 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-frm7z" event={"ID":"fa0bf523-6582-46b4-9134-28880a50b474","Type":"ContainerStarted","Data":"d68ef00952b1357bb1e49103f75298e3f4ea20890953a998565a8e4d1e6a7bca"}
Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.548049 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" event={"ID":"810605ea-bf2f-4cd2-87a9-a09e9d5e7110","Type":"ContainerStarted","Data":"ec7c242400c3788f70c2a91c8d7378bde577b0383ee21a7d115406b72e0bfbe5"}
Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.549312 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539272-nq8dk" event={"ID":"8d574e82-f840-4f0c-982d-f6a133bd64ae","Type":"ContainerStarted","Data":"41f9d0eeab6a97d7a4669fab8bc8a58ebb5b5c501c74bede3e0472eab0fbf2e6"}
Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.609606 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 01 09:12:40 crc kubenswrapper[4792]: W0301 09:12:40.615112 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4bc9fd32_61ce_4fbb_b67e_1376102f5384.slice/crio-e1aa5214c0457299056866d70c565c5094c4dfa52c85c6428fa756810ed8da10 WatchSource:0}: Error finding container e1aa5214c0457299056866d70c565c5094c4dfa52c85c6428fa756810ed8da10: Status 404 returned error can't find the container with id e1aa5214c0457299056866d70c565c5094c4dfa52c85c6428fa756810ed8da10
Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.620796 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.683374 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-848d8759d6-kmmxk"]
Mar 01 09:12:40 crc kubenswrapper[4792]: W0301 09:12:40.689595 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1761bae5_8e03_478f_938c_df41041a062c.slice/crio-7470d72dfdecb7555879e3e75bbc32cdde09ed41a6f66ab3ea84fdfcb9d1191a WatchSource:0}: Error finding container 7470d72dfdecb7555879e3e75bbc32cdde09ed41a6f66ab3ea84fdfcb9d1191a: Status 404 returned error can't find the container with id 7470d72dfdecb7555879e3e75bbc32cdde09ed41a6f66ab3ea84fdfcb9d1191a
Mar 01 09:12:41 crc kubenswrapper[4792]: I0301 09:12:41.556644 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4bc9fd32-61ce-4fbb-b67e-1376102f5384","Type":"ContainerStarted","Data":"e1aa5214c0457299056866d70c565c5094c4dfa52c85c6428fa756810ed8da10"}
Mar 01 09:12:41 crc kubenswrapper[4792]: I0301 09:12:41.559498 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15c3e52b-97f9-45e8-a7ba-c360739547e7","Type":"ContainerStarted","Data":"f382784376f22fb9daf42b2e14133ec72dd7414c987c151edf216564d1c7d317"}
Mar 01 09:12:41 crc kubenswrapper[4792]: I0301 09:12:41.559548 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15c3e52b-97f9-45e8-a7ba-c360739547e7","Type":"ContainerStarted","Data":"6dc5a90f68bbb98f1b1144b481316b3233e7eb33cd6b34c20b580259bfcc2ec8"}
Mar 01 09:12:41 crc kubenswrapper[4792]: I0301 09:12:41.560797 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" event={"ID":"1761bae5-8e03-478f-938c-df41041a062c","Type":"ContainerStarted","Data":"7470d72dfdecb7555879e3e75bbc32cdde09ed41a6f66ab3ea84fdfcb9d1191a"}
Mar 01 09:12:41 crc kubenswrapper[4792]: I0301 09:12:41.562044 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" event={"ID":"810605ea-bf2f-4cd2-87a9-a09e9d5e7110","Type":"ContainerStarted","Data":"6c2c0cfa98afb0d975335d6142781b6265b7d1ccb65a4ca10172168101d7316d"}
Mar 01 09:12:42 crc kubenswrapper[4792]: E0301 09:12:42.238463 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 01 09:12:42 crc kubenswrapper[4792]: E0301 09:12:42.238827 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-whj6h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-p6b9w_openshift-marketplace(6fd91972-6bfc-4041-abc2-8f4298584603): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 01 09:12:42 crc kubenswrapper[4792]: E0301 09:12:42.240187 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-p6b9w" podUID="6fd91972-6bfc-4041-abc2-8f4298584603"
Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.568201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-frm7z" event={"ID":"fa0bf523-6582-46b4-9134-28880a50b474","Type":"ContainerStarted","Data":"41bb3bce4ab3fe37dbba668e29eee509bd06b39592fb65b56c393ef53fd7337f"}
Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.571374 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4bc9fd32-61ce-4fbb-b67e-1376102f5384","Type":"ContainerStarted","Data":"7ee832f8488525fdd9e2872f5f8c217b74292b7dc7a5f5c0537fec99a3845c5f"}
Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.573162 4792 generic.go:334] "Generic (PLEG): container finished" podID="15c3e52b-97f9-45e8-a7ba-c360739547e7" containerID="f382784376f22fb9daf42b2e14133ec72dd7414c987c151edf216564d1c7d317" exitCode=0
Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.573219 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15c3e52b-97f9-45e8-a7ba-c360739547e7","Type":"ContainerDied","Data":"f382784376f22fb9daf42b2e14133ec72dd7414c987c151edf216564d1c7d317"}
Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.574970 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" event={"ID":"1761bae5-8e03-478f-938c-df41041a062c","Type":"ContainerStarted","Data":"c6c92a3162c61d2929d59a565ad6956d560ccb189eb5098f9d86e3c3e62e3291"}
Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.575403 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk"
Mar 01 09:12:42 crc kubenswrapper[4792]: E0301 09:12:42.586628 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-p6b9w" podUID="6fd91972-6bfc-4041-abc2-8f4298584603"
Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.592069 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=18.592054354 podStartE2EDuration="18.592054354s" podCreationTimestamp="2026-03-01 09:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:12:42.588864194 +0000 UTC m=+291.830743391" watchObservedRunningTime="2026-03-01 09:12:42.592054354 +0000 UTC m=+291.833933541"
Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.611078 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk"
Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.690839 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" podStartSLOduration=25.690822812 podStartE2EDuration="25.690822812s" podCreationTimestamp="2026-03-01 09:12:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:12:42.688997146 +0000 UTC m=+291.930876343" watchObservedRunningTime="2026-03-01 09:12:42.690822812 +0000 UTC m=+291.932702009"
Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.725403 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" podStartSLOduration=25.725386745 podStartE2EDuration="25.725386745s" podCreationTimestamp="2026-03-01 09:12:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:12:42.722993755 +0000 UTC m=+291.964872952" watchObservedRunningTime="2026-03-01 09:12:42.725386745 +0000 UTC m=+291.967265942"
Mar 01 09:12:42 crc kubenswrapper[4792]: E0301 09:12:42.977597 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 01 09:12:42 crc kubenswrapper[4792]: E0301 09:12:42.977740 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nc28k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-wxb87_openshift-marketplace(9073e3da-2d6f-48a3-907a-e347f28559ae): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 01 09:12:42 crc kubenswrapper[4792]: E0301 09:12:42.980056 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-wxb87" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae"
Mar 01 09:12:43 crc kubenswrapper[4792]: I0301 09:12:43.259708 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prqqp"]
Mar 01 09:12:43 crc kubenswrapper[4792]: I0301 09:12:43.580496 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-frm7z" event={"ID":"fa0bf523-6582-46b4-9134-28880a50b474","Type":"ContainerStarted","Data":"842d74ac38d89984faf44a260f0f962bba2e310a1347d674617143778b1ee622"}
Mar 01 09:12:43 crc kubenswrapper[4792]: E0301 09:12:43.581836 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-wxb87" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae"
Mar 01 09:12:43 crc kubenswrapper[4792]: I0301 09:12:43.602394 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-frm7z" podStartSLOduration=231.602365598 podStartE2EDuration="3m51.602365598s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:12:43.599111926 +0000 UTC m=+292.840991133" watchObservedRunningTime="2026-03-01 09:12:43.602365598 +0000 UTC m=+292.844244795"
Mar 01 09:12:43 crc kubenswrapper[4792]: I0301 09:12:43.833185 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.020759 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15c3e52b-97f9-45e8-a7ba-c360739547e7-kube-api-access\") pod \"15c3e52b-97f9-45e8-a7ba-c360739547e7\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") "
Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.020835 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15c3e52b-97f9-45e8-a7ba-c360739547e7-kubelet-dir\") pod \"15c3e52b-97f9-45e8-a7ba-c360739547e7\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") "
Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.021213 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15c3e52b-97f9-45e8-a7ba-c360739547e7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "15c3e52b-97f9-45e8-a7ba-c360739547e7" (UID: "15c3e52b-97f9-45e8-a7ba-c360739547e7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.025848 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15c3e52b-97f9-45e8-a7ba-c360739547e7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "15c3e52b-97f9-45e8-a7ba-c360739547e7" (UID: "15c3e52b-97f9-45e8-a7ba-c360739547e7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.121839 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15c3e52b-97f9-45e8-a7ba-c360739547e7-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.121882 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15c3e52b-97f9-45e8-a7ba-c360739547e7-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.589973 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.591991 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15c3e52b-97f9-45e8-a7ba-c360739547e7","Type":"ContainerDied","Data":"6dc5a90f68bbb98f1b1144b481316b3233e7eb33cd6b34c20b580259bfcc2ec8"}
Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.592027 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dc5a90f68bbb98f1b1144b481316b3233e7eb33cd6b34c20b580259bfcc2ec8"
Mar 01 09:12:44 crc kubenswrapper[4792]: E0301 09:12:44.643521 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 01 09:12:44 crc kubenswrapper[4792]: E0301 09:12:44.643947 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-thnsb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-cw675_openshift-marketplace(dff0d675-52dd-4cac-a7be-8750333c28e3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 01 09:12:44 crc kubenswrapper[4792]: E0301 09:12:44.645192 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-cw675" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3"
Mar 01 09:12:44 crc kubenswrapper[4792]: E0301 09:12:44.848016 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Mar 01 09:12:44 crc kubenswrapper[4792]: E0301 09:12:44.848204 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-djdkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7nwc2_openshift-marketplace(22c7c368-3523-4224-aebd-59b29640bed0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 01 09:12:44 crc kubenswrapper[4792]: E0301 09:12:44.849414 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7nwc2" podUID="22c7c368-3523-4224-aebd-59b29640bed0"
Mar 01 09:12:45 crc kubenswrapper[4792]: I0301 09:12:45.416718 4792 csr.go:261] certificate signing request csr-tfd7m is approved, waiting to be issued
Mar 01 09:12:45 crc kubenswrapper[4792]: I0301 09:12:45.422590 4792 csr.go:257] certificate signing request csr-tfd7m is issued
Mar 01 09:12:45 crc kubenswrapper[4792]: I0301 09:12:45.595991 4792 generic.go:334] "Generic (PLEG): container finished" podID="b4130507-2de2-48c2-9c3f-e9474aeca556" containerID="f692f356115e5b53ef6a4d81f9a4c258c05c49397508f23df7e1bd78fc94331c" exitCode=0
Mar 01 09:12:45 crc kubenswrapper[4792]: I0301 09:12:45.596065 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539270-q7hck" event={"ID":"b4130507-2de2-48c2-9c3f-e9474aeca556","Type":"ContainerDied","Data":"f692f356115e5b53ef6a4d81f9a4c258c05c49397508f23df7e1bd78fc94331c"}
Mar 01 09:12:45 crc kubenswrapper[4792]: E0301 09:12:45.598164 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-cw675" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3"
Mar 01 09:12:45 crc kubenswrapper[4792]: E0301 09:12:45.598182 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7nwc2" podUID="22c7c368-3523-4224-aebd-59b29640bed0"
Mar 01 09:12:46 crc kubenswrapper[4792]: I0301 09:12:46.423995 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-19 07:52:47.639347513 +0000 UTC
Mar 01 09:12:46 crc kubenswrapper[4792]: I0301 09:12:46.424028 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7030h40m1.215321809s for next certificate rotation
Mar 01 09:12:46 crc kubenswrapper[4792]: I0301 09:12:46.601521 4792 generic.go:334] "Generic (PLEG): container finished" podID="8d574e82-f840-4f0c-982d-f6a133bd64ae" containerID="5f51f2f66c61a102a6b43ee525bfb8b5ff9da77472d4107c4db4ba5e29f6a9ee" exitCode=0
Mar 01 09:12:46 crc kubenswrapper[4792]: I0301 09:12:46.601615 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539272-nq8dk" event={"ID":"8d574e82-f840-4f0c-982d-f6a133bd64ae","Type":"ContainerDied","Data":"5f51f2f66c61a102a6b43ee525bfb8b5ff9da77472d4107c4db4ba5e29f6a9ee"}
Mar 01 09:12:46 crc kubenswrapper[4792]: I0301 09:12:46.890542 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539270-q7hck"
Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.059839 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql5bq\" (UniqueName: \"kubernetes.io/projected/b4130507-2de2-48c2-9c3f-e9474aeca556-kube-api-access-ql5bq\") pod \"b4130507-2de2-48c2-9c3f-e9474aeca556\" (UID: \"b4130507-2de2-48c2-9c3f-e9474aeca556\") "
Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.065367 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4130507-2de2-48c2-9c3f-e9474aeca556-kube-api-access-ql5bq" (OuterVolumeSpecName: "kube-api-access-ql5bq") pod "b4130507-2de2-48c2-9c3f-e9474aeca556" (UID: "b4130507-2de2-48c2-9c3f-e9474aeca556"). InnerVolumeSpecName "kube-api-access-ql5bq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.161717 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql5bq\" (UniqueName: \"kubernetes.io/projected/b4130507-2de2-48c2-9c3f-e9474aeca556-kube-api-access-ql5bq\") on node \"crc\" DevicePath \"\""
Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.342016 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn"
Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.346870 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn"
Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.431110 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-25 19:58:31.172266767 +0000 UTC
Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.431138 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7186h45m43.741130849s for next certificate rotation
Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.607998 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539270-q7hck" event={"ID":"b4130507-2de2-48c2-9c3f-e9474aeca556","Type":"ContainerDied","Data":"ff027c882d14f4f304f09047fac83f66f0a08cd7a1d80eea6711ce8cef3e6448"}
Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.608034 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff027c882d14f4f304f09047fac83f66f0a08cd7a1d80eea6711ce8cef3e6448"
Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.608118 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539270-q7hck"
Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.884358 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539272-nq8dk"
Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.972366 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjz8h\" (UniqueName: \"kubernetes.io/projected/8d574e82-f840-4f0c-982d-f6a133bd64ae-kube-api-access-hjz8h\") pod \"8d574e82-f840-4f0c-982d-f6a133bd64ae\" (UID: \"8d574e82-f840-4f0c-982d-f6a133bd64ae\") "
Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.975830 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d574e82-f840-4f0c-982d-f6a133bd64ae-kube-api-access-hjz8h" (OuterVolumeSpecName: "kube-api-access-hjz8h") pod "8d574e82-f840-4f0c-982d-f6a133bd64ae" (UID: "8d574e82-f840-4f0c-982d-f6a133bd64ae"). InnerVolumeSpecName "kube-api-access-hjz8h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:12:48 crc kubenswrapper[4792]: I0301 09:12:48.074469 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjz8h\" (UniqueName: \"kubernetes.io/projected/8d574e82-f840-4f0c-982d-f6a133bd64ae-kube-api-access-hjz8h\") on node \"crc\" DevicePath \"\""
Mar 01 09:12:48 crc kubenswrapper[4792]: I0301 09:12:48.615245 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n28r8" event={"ID":"e03dee1d-d7ca-422c-8af6-faa0c1af3863","Type":"ContainerStarted","Data":"d7a22eb25032f48508905d8110fc2779a9b8e1d8380aa44e93999853b14a1f56"}
Mar 01 09:12:48 crc kubenswrapper[4792]: I0301 09:12:48.616521 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539272-nq8dk" event={"ID":"8d574e82-f840-4f0c-982d-f6a133bd64ae","Type":"ContainerDied","Data":"41f9d0eeab6a97d7a4669fab8bc8a58ebb5b5c501c74bede3e0472eab0fbf2e6"}
Mar 01 09:12:48 crc kubenswrapper[4792]: I0301 09:12:48.616599 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41f9d0eeab6a97d7a4669fab8bc8a58ebb5b5c501c74bede3e0472eab0fbf2e6"
Mar 01 09:12:48 crc kubenswrapper[4792]: I0301 09:12:48.616612 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539272-nq8dk"
Mar 01 09:12:49 crc kubenswrapper[4792]: I0301 09:12:49.621606 4792 generic.go:334] "Generic (PLEG): container finished" podID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerID="d7a22eb25032f48508905d8110fc2779a9b8e1d8380aa44e93999853b14a1f56" exitCode=0
Mar 01 09:12:49 crc kubenswrapper[4792]: I0301 09:12:49.622658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n28r8" event={"ID":"e03dee1d-d7ca-422c-8af6-faa0c1af3863","Type":"ContainerDied","Data":"d7a22eb25032f48508905d8110fc2779a9b8e1d8380aa44e93999853b14a1f56"}
Mar 01 09:12:49 crc kubenswrapper[4792]: I0301 09:12:49.625006 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftk4v" event={"ID":"e93e87c0-86c5-446d-9f43-71d17960a351","Type":"ContainerStarted","Data":"4ae585f0378f09e8557ad2a19a0474c0c5f7679c6fe6f603a65e86c389831b72"}
Mar 01 09:12:50 crc kubenswrapper[4792]: I0301 09:12:50.636294 4792 generic.go:334] "Generic (PLEG): container finished" podID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerID="8da1a6a75bb09923b49fb00136c53f3ad6da4b84f38958d2ae68061cac2e183c" exitCode=0
Mar 01 09:12:50 crc kubenswrapper[4792]: I0301 09:12:50.636331 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwt8t" event={"ID":"fee8fc8f-8d72-4606-b115-4197f599cfcb","Type":"ContainerDied","Data":"8da1a6a75bb09923b49fb00136c53f3ad6da4b84f38958d2ae68061cac2e183c"}
Mar 01 09:12:50 crc kubenswrapper[4792]: I0301 09:12:50.638132 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n28r8" event={"ID":"e03dee1d-d7ca-422c-8af6-faa0c1af3863","Type":"ContainerStarted","Data":"1897e3e89e2859f0dcfe575b896a35cdde57822147e478674713823e7d25153f"}
Mar 01 09:12:50 crc kubenswrapper[4792]: I0301 09:12:50.642894 4792 generic.go:334] "Generic (PLEG): container finished" podID="e93e87c0-86c5-446d-9f43-71d17960a351" containerID="4ae585f0378f09e8557ad2a19a0474c0c5f7679c6fe6f603a65e86c389831b72" exitCode=0
Mar 01 09:12:50 crc kubenswrapper[4792]: I0301 09:12:50.642950 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftk4v" event={"ID":"e93e87c0-86c5-446d-9f43-71d17960a351","Type":"ContainerDied","Data":"4ae585f0378f09e8557ad2a19a0474c0c5f7679c6fe6f603a65e86c389831b72"}
Mar 01 09:12:50 crc kubenswrapper[4792]: I0301 09:12:50.672898 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n28r8" podStartSLOduration=3.532604609 podStartE2EDuration="1m6.67288009s" podCreationTimestamp="2026-03-01 09:11:44 +0000 UTC" firstStartedPulling="2026-03-01 09:11:47.034734229 +0000 UTC m=+236.276613426" lastFinishedPulling="2026-03-01 09:12:50.17500971 +0000 UTC m=+299.416888907" observedRunningTime="2026-03-01 09:12:50.671391943 +0000 UTC m=+299.913271150" watchObservedRunningTime="2026-03-01 09:12:50.67288009 +0000 UTC m=+299.914759287"
Mar 01 09:12:51 crc kubenswrapper[4792]: I0301 09:12:51.664263 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwt8t" event={"ID":"fee8fc8f-8d72-4606-b115-4197f599cfcb","Type":"ContainerStarted","Data":"f48560c9b3d2ef7a82a59fb15bc8fde5c80832322a811c783edfd4310dd071e6"}
Mar 01 09:12:51 crc kubenswrapper[4792]: I0301 09:12:51.667263 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftk4v" event={"ID":"e93e87c0-86c5-446d-9f43-71d17960a351","Type":"ContainerStarted","Data":"66fe92889b2819aa449c7b25e42dabb8e422bdcf054e9c103637bfad90b589e3"}
Mar 01 09:12:51 crc kubenswrapper[4792]: I0301 09:12:51.681793 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zwt8t" podStartSLOduration=2.398941363 podStartE2EDuration="1m6.68177938s" podCreationTimestamp="2026-03-01 09:11:45 +0000 UTC" firstStartedPulling="2026-03-01 09:11:46.998521037 +0000 UTC m=+236.240400234" lastFinishedPulling="2026-03-01 09:12:51.281359054 +0000 UTC m=+300.523238251" observedRunningTime="2026-03-01 09:12:51.681698488 +0000 UTC m=+300.923577675" watchObservedRunningTime="2026-03-01 09:12:51.68177938 +0000 UTC m=+300.923658577"
Mar 01 09:12:51 crc kubenswrapper[4792]: I0301 09:12:51.702836 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ftk4v" podStartSLOduration=2.6093556209999997 podStartE2EDuration="1m5.702820915s" podCreationTimestamp="2026-03-01 09:11:46 +0000 UTC" firstStartedPulling="2026-03-01 09:11:48.072900337 +0000 UTC m=+237.314779534" lastFinishedPulling="2026-03-01 09:12:51.166365631 +0000 UTC m=+300.408244828" observedRunningTime="2026-03-01 09:12:51.701204775 +0000 UTC m=+300.943083972" watchObservedRunningTime="2026-03-01 09:12:51.702820915 +0000 UTC m=+300.944700112"
Mar 01 09:12:52 crc kubenswrapper[4792]: I0301 09:12:52.678599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7ngd" event={"ID":"8333a325-229b-4dfd-a1f8-966f39bf55fc","Type":"ContainerStarted","Data":"fe153c70dda9f9d22a7d85fdfcdcada881b1df1b9449eeb2a850a3a6208653c3"}
Mar 01 09:12:53 crc kubenswrapper[4792]: I0301 09:12:53.684618 4792 generic.go:334] "Generic (PLEG): container finished" podID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerID="fe153c70dda9f9d22a7d85fdfcdcada881b1df1b9449eeb2a850a3a6208653c3" exitCode=0
Mar 01 09:12:53 crc kubenswrapper[4792]: I0301 09:12:53.684663 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7ngd" event={"ID":"8333a325-229b-4dfd-a1f8-966f39bf55fc","Type":"ContainerDied","Data":"fe153c70dda9f9d22a7d85fdfcdcada881b1df1b9449eeb2a850a3a6208653c3"}
Mar 01 09:12:54 crc kubenswrapper[4792]: I0301 09:12:54.690732 4792 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7ngd" event={"ID":"8333a325-229b-4dfd-a1f8-966f39bf55fc","Type":"ContainerStarted","Data":"eb013fd09760d3456bbf7d33bd14056334305ee37d0672c8be7735153ad43a65"} Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.147224 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.147311 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.430790 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l7ngd" podStartSLOduration=2.768727987 podStartE2EDuration="1m12.430774877s" podCreationTimestamp="2026-03-01 09:11:43 +0000 UTC" firstStartedPulling="2026-03-01 09:11:44.809652582 +0000 UTC m=+234.051531779" lastFinishedPulling="2026-03-01 09:12:54.471699472 +0000 UTC m=+303.713578669" observedRunningTime="2026-03-01 09:12:54.71137109 +0000 UTC m=+303.953250287" watchObservedRunningTime="2026-03-01 09:12:55.430774877 +0000 UTC m=+304.672654074" Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.517258 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.543531 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.543575 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.587754 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.734492 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:12:56 crc kubenswrapper[4792]: I0301 09:12:56.624042 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:12:56 crc kubenswrapper[4792]: I0301 09:12:56.625498 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:12:56 crc kubenswrapper[4792]: I0301 09:12:56.703565 4792 generic.go:334] "Generic (PLEG): container finished" podID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerID="4b4748f2f641b6c36501b4a63c23878d8149ab901a48ba6ca75290682667f801" exitCode=0 Mar 01 09:12:56 crc kubenswrapper[4792]: I0301 09:12:56.704598 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxb87" event={"ID":"9073e3da-2d6f-48a3-907a-e347f28559ae","Type":"ContainerDied","Data":"4b4748f2f641b6c36501b4a63c23878d8149ab901a48ba6ca75290682667f801"} Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.665290 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ftk4v" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="registry-server" probeResult="failure" output=< Mar 01 09:12:57 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:12:57 crc kubenswrapper[4792]: > Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.709932 4792 generic.go:334] "Generic (PLEG): container finished" podID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerID="60c5c14884f306ec02b79c73b52f33a5df66be0f71db3ee1b928c0b932dd06d6" exitCode=0 Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.710009 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-cw675" event={"ID":"dff0d675-52dd-4cac-a7be-8750333c28e3","Type":"ContainerDied","Data":"60c5c14884f306ec02b79c73b52f33a5df66be0f71db3ee1b928c0b932dd06d6"} Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.711799 4792 generic.go:334] "Generic (PLEG): container finished" podID="6fd91972-6bfc-4041-abc2-8f4298584603" containerID="303602554c4af893edadfe462be0a7b315bc51a29028b034dfe41a16ef93ff3e" exitCode=0 Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.711830 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6b9w" event={"ID":"6fd91972-6bfc-4041-abc2-8f4298584603","Type":"ContainerDied","Data":"303602554c4af893edadfe462be0a7b315bc51a29028b034dfe41a16ef93ff3e"} Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.716134 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxb87" event={"ID":"9073e3da-2d6f-48a3-907a-e347f28559ae","Type":"ContainerStarted","Data":"9dea4572b2cae8fb3cd81b4d5fe3e648bf003be8e11b2fe3e243bf7601f66f32"} Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.785874 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wxb87" podStartSLOduration=3.445852139 podStartE2EDuration="1m15.785850213s" podCreationTimestamp="2026-03-01 09:11:42 +0000 UTC" firstStartedPulling="2026-03-01 09:11:44.807849498 +0000 UTC m=+234.049728695" lastFinishedPulling="2026-03-01 09:12:57.147847572 +0000 UTC m=+306.389726769" observedRunningTime="2026-03-01 09:12:57.780737045 +0000 UTC m=+307.022616252" watchObservedRunningTime="2026-03-01 09:12:57.785850213 +0000 UTC m=+307.027729420" Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.866796 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-848d8759d6-kmmxk"] Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.867047 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" podUID="1761bae5-8e03-478f-938c-df41041a062c" containerName="controller-manager" containerID="cri-o://c6c92a3162c61d2929d59a565ad6956d560ccb189eb5098f9d86e3c3e62e3291" gracePeriod=30 Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.976881 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn"] Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.977122 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" podUID="810605ea-bf2f-4cd2-87a9-a09e9d5e7110" containerName="route-controller-manager" containerID="cri-o://6c2c0cfa98afb0d975335d6142781b6265b7d1ccb65a4ca10172168101d7316d" gracePeriod=30 Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.737312 4792 generic.go:334] "Generic (PLEG): container finished" podID="810605ea-bf2f-4cd2-87a9-a09e9d5e7110" containerID="6c2c0cfa98afb0d975335d6142781b6265b7d1ccb65a4ca10172168101d7316d" exitCode=0 Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.737402 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" event={"ID":"810605ea-bf2f-4cd2-87a9-a09e9d5e7110","Type":"ContainerDied","Data":"6c2c0cfa98afb0d975335d6142781b6265b7d1ccb65a4ca10172168101d7316d"} Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.740978 4792 generic.go:334] "Generic (PLEG): container finished" podID="1761bae5-8e03-478f-938c-df41041a062c" containerID="c6c92a3162c61d2929d59a565ad6956d560ccb189eb5098f9d86e3c3e62e3291" exitCode=0 Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.741041 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" 
event={"ID":"1761bae5-8e03-478f-938c-df41041a062c","Type":"ContainerDied","Data":"c6c92a3162c61d2929d59a565ad6956d560ccb189eb5098f9d86e3c3e62e3291"} Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.877456 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910130 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk"] Mar 01 09:12:59 crc kubenswrapper[4792]: E0301 09:12:59.910362 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d574e82-f840-4f0c-982d-f6a133bd64ae" containerName="oc" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910376 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d574e82-f840-4f0c-982d-f6a133bd64ae" containerName="oc" Mar 01 09:12:59 crc kubenswrapper[4792]: E0301 09:12:59.910391 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15c3e52b-97f9-45e8-a7ba-c360739547e7" containerName="pruner" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910398 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="15c3e52b-97f9-45e8-a7ba-c360739547e7" containerName="pruner" Mar 01 09:12:59 crc kubenswrapper[4792]: E0301 09:12:59.910412 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4130507-2de2-48c2-9c3f-e9474aeca556" containerName="oc" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910420 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4130507-2de2-48c2-9c3f-e9474aeca556" containerName="oc" Mar 01 09:12:59 crc kubenswrapper[4792]: E0301 09:12:59.910437 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="810605ea-bf2f-4cd2-87a9-a09e9d5e7110" containerName="route-controller-manager" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910445 4792 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="810605ea-bf2f-4cd2-87a9-a09e9d5e7110" containerName="route-controller-manager" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910558 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4130507-2de2-48c2-9c3f-e9474aeca556" containerName="oc" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910574 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="810605ea-bf2f-4cd2-87a9-a09e9d5e7110" containerName="route-controller-manager" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910585 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d574e82-f840-4f0c-982d-f6a133bd64ae" containerName="oc" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910600 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="15c3e52b-97f9-45e8-a7ba-c360739547e7" containerName="pruner" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.912406 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.926997 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.929880 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk"] Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.944537 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-client-ca\") pod \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.944588 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz4dk\" (UniqueName: \"kubernetes.io/projected/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-kube-api-access-pz4dk\") pod \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.944610 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-config\") pod \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.944649 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-serving-cert\") pod \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.945187 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-client-ca" (OuterVolumeSpecName: "client-ca") pod "810605ea-bf2f-4cd2-87a9-a09e9d5e7110" 
(UID: "810605ea-bf2f-4cd2-87a9-a09e9d5e7110"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.945439 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-config" (OuterVolumeSpecName: "config") pod "810605ea-bf2f-4cd2-87a9-a09e9d5e7110" (UID: "810605ea-bf2f-4cd2-87a9-a09e9d5e7110"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.951041 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "810605ea-bf2f-4cd2-87a9-a09e9d5e7110" (UID: "810605ea-bf2f-4cd2-87a9-a09e9d5e7110"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.966130 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-kube-api-access-pz4dk" (OuterVolumeSpecName: "kube-api-access-pz4dk") pod "810605ea-bf2f-4cd2-87a9-a09e9d5e7110" (UID: "810605ea-bf2f-4cd2-87a9-a09e9d5e7110"). InnerVolumeSpecName "kube-api-access-pz4dk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046279 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-client-ca\") pod \"1761bae5-8e03-478f-938c-df41041a062c\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046348 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l558v\" (UniqueName: \"kubernetes.io/projected/1761bae5-8e03-478f-938c-df41041a062c-kube-api-access-l558v\") pod \"1761bae5-8e03-478f-938c-df41041a062c\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046384 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-proxy-ca-bundles\") pod \"1761bae5-8e03-478f-938c-df41041a062c\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046430 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1761bae5-8e03-478f-938c-df41041a062c-serving-cert\") pod \"1761bae5-8e03-478f-938c-df41041a062c\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046456 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-config\") pod \"1761bae5-8e03-478f-938c-df41041a062c\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046630 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-config\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046697 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-client-ca\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046724 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f32059d6-24aa-4b5a-afb2-b2cf75704d53-serving-cert\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046748 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chxj8\" (UniqueName: \"kubernetes.io/projected/f32059d6-24aa-4b5a-afb2-b2cf75704d53-kube-api-access-chxj8\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046802 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046818 4792 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-pz4dk\" (UniqueName: \"kubernetes.io/projected/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-kube-api-access-pz4dk\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046830 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046841 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.047715 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-client-ca" (OuterVolumeSpecName: "client-ca") pod "1761bae5-8e03-478f-938c-df41041a062c" (UID: "1761bae5-8e03-478f-938c-df41041a062c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.047732 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1761bae5-8e03-478f-938c-df41041a062c" (UID: "1761bae5-8e03-478f-938c-df41041a062c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.047859 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-config" (OuterVolumeSpecName: "config") pod "1761bae5-8e03-478f-938c-df41041a062c" (UID: "1761bae5-8e03-478f-938c-df41041a062c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.049653 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1761bae5-8e03-478f-938c-df41041a062c-kube-api-access-l558v" (OuterVolumeSpecName: "kube-api-access-l558v") pod "1761bae5-8e03-478f-938c-df41041a062c" (UID: "1761bae5-8e03-478f-938c-df41041a062c"). InnerVolumeSpecName "kube-api-access-l558v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.049808 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1761bae5-8e03-478f-938c-df41041a062c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1761bae5-8e03-478f-938c-df41041a062c" (UID: "1761bae5-8e03-478f-938c-df41041a062c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.147758 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-client-ca\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.147808 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f32059d6-24aa-4b5a-afb2-b2cf75704d53-serving-cert\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.147830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chxj8\" (UniqueName: 
\"kubernetes.io/projected/f32059d6-24aa-4b5a-afb2-b2cf75704d53-kube-api-access-chxj8\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.147895 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-config\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.147977 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.147999 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l558v\" (UniqueName: \"kubernetes.io/projected/1761bae5-8e03-478f-938c-df41041a062c-kube-api-access-l558v\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.148013 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.148023 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1761bae5-8e03-478f-938c-df41041a062c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.148033 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-config\") on node 
\"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.149284 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-config\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.151206 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-client-ca\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.156917 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f32059d6-24aa-4b5a-afb2-b2cf75704d53-serving-cert\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.162930 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chxj8\" (UniqueName: \"kubernetes.io/projected/f32059d6-24aa-4b5a-afb2-b2cf75704d53-kube-api-access-chxj8\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.241473 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.444876 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk"] Mar 01 09:13:00 crc kubenswrapper[4792]: W0301 09:13:00.451299 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf32059d6_24aa_4b5a_afb2_b2cf75704d53.slice/crio-8e9b5bce7a19f5245dcb156079f150381ad068104bb67fabd9a693966fe2fe05 WatchSource:0}: Error finding container 8e9b5bce7a19f5245dcb156079f150381ad068104bb67fabd9a693966fe2fe05: Status 404 returned error can't find the container with id 8e9b5bce7a19f5245dcb156079f150381ad068104bb67fabd9a693966fe2fe05 Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.754999 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" event={"ID":"f32059d6-24aa-4b5a-afb2-b2cf75704d53","Type":"ContainerStarted","Data":"8e9b5bce7a19f5245dcb156079f150381ad068104bb67fabd9a693966fe2fe05"} Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.756251 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" event={"ID":"810605ea-bf2f-4cd2-87a9-a09e9d5e7110","Type":"ContainerDied","Data":"ec7c242400c3788f70c2a91c8d7378bde577b0383ee21a7d115406b72e0bfbe5"} Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.756281 4792 scope.go:117] "RemoveContainer" containerID="6c2c0cfa98afb0d975335d6142781b6265b7d1ccb65a4ca10172168101d7316d" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.756373 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.759328 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" event={"ID":"1761bae5-8e03-478f-938c-df41041a062c","Type":"ContainerDied","Data":"7470d72dfdecb7555879e3e75bbc32cdde09ed41a6f66ab3ea84fdfcb9d1191a"} Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.759518 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.786379 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn"] Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.789985 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn"] Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.799045 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-848d8759d6-kmmxk"] Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.801566 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-848d8759d6-kmmxk"] Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.812371 4792 patch_prober.go:28] interesting pod/controller-manager-848d8759d6-kmmxk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.812408 4792 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" podUID="1761bae5-8e03-478f-938c-df41041a062c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 01 09:13:01 crc kubenswrapper[4792]: I0301 09:13:01.416697 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1761bae5-8e03-478f-938c-df41041a062c" path="/var/lib/kubelet/pods/1761bae5-8e03-478f-938c-df41041a062c/volumes" Mar 01 09:13:01 crc kubenswrapper[4792]: I0301 09:13:01.418091 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="810605ea-bf2f-4cd2-87a9-a09e9d5e7110" path="/var/lib/kubelet/pods/810605ea-bf2f-4cd2-87a9-a09e9d5e7110/volumes" Mar 01 09:13:01 crc kubenswrapper[4792]: I0301 09:13:01.484378 4792 scope.go:117] "RemoveContainer" containerID="c6c92a3162c61d2929d59a565ad6956d560ccb189eb5098f9d86e3c3e62e3291" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.458721 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-774f47c464-sjvj2"] Mar 01 09:13:02 crc kubenswrapper[4792]: E0301 09:13:02.459318 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1761bae5-8e03-478f-938c-df41041a062c" containerName="controller-manager" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.459334 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1761bae5-8e03-478f-938c-df41041a062c" containerName="controller-manager" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.459456 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1761bae5-8e03-478f-938c-df41041a062c" containerName="controller-manager" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.459887 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.462369 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.462626 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.462768 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.462797 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.462881 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.464987 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.473057 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.479759 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774f47c464-sjvj2"] Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.579175 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb03c466-db7d-4cc2-9744-fb90116eac6f-serving-cert\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " 
pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.579255 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-client-ca\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.579357 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-config\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.579461 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk7hf\" (UniqueName: \"kubernetes.io/projected/cb03c466-db7d-4cc2-9744-fb90116eac6f-kube-api-access-wk7hf\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.579509 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-proxy-ca-bundles\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.680828 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk7hf\" 
(UniqueName: \"kubernetes.io/projected/cb03c466-db7d-4cc2-9744-fb90116eac6f-kube-api-access-wk7hf\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.680975 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-proxy-ca-bundles\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.681094 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb03c466-db7d-4cc2-9744-fb90116eac6f-serving-cert\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.681947 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-proxy-ca-bundles\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.682485 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-client-ca\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.683701 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-client-ca\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.684053 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-config\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.686214 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-config\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.687401 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb03c466-db7d-4cc2-9744-fb90116eac6f-serving-cert\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.699185 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk7hf\" (UniqueName: \"kubernetes.io/projected/cb03c466-db7d-4cc2-9744-fb90116eac6f-kube-api-access-wk7hf\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc 
kubenswrapper[4792]: I0301 09:13:02.779401 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.021585 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.022169 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.074216 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.564256 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.564405 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.620643 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.819103 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.830189 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:13:04 crc kubenswrapper[4792]: I0301 09:13:04.942556 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:13:04 crc kubenswrapper[4792]: I0301 09:13:04.942608 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:13:04 crc kubenswrapper[4792]: I0301 09:13:04.942647 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:13:04 crc kubenswrapper[4792]: I0301 09:13:04.943148 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:13:04 crc kubenswrapper[4792]: I0301 09:13:04.943199 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac" gracePeriod=600 Mar 01 09:13:05 crc kubenswrapper[4792]: I0301 09:13:05.605538 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:13:06 crc kubenswrapper[4792]: I0301 09:13:06.660225 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:13:06 crc kubenswrapper[4792]: I0301 09:13:06.698891 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:13:06 crc kubenswrapper[4792]: I0301 09:13:06.794656 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac" exitCode=0 Mar 01 09:13:06 crc kubenswrapper[4792]: I0301 09:13:06.794993 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac"} Mar 01 09:13:07 crc kubenswrapper[4792]: I0301 09:13:07.864118 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l7ngd"] Mar 01 09:13:07 crc kubenswrapper[4792]: I0301 09:13:07.864488 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l7ngd" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerName="registry-server" containerID="cri-o://eb013fd09760d3456bbf7d33bd14056334305ee37d0672c8be7735153ad43a65" gracePeriod=2 Mar 01 09:13:08 crc kubenswrapper[4792]: I0301 09:13:08.065212 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwt8t"] Mar 01 09:13:08 crc kubenswrapper[4792]: I0301 09:13:08.065460 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zwt8t" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerName="registry-server" containerID="cri-o://f48560c9b3d2ef7a82a59fb15bc8fde5c80832322a811c783edfd4310dd071e6" gracePeriod=2 Mar 01 09:13:08 crc kubenswrapper[4792]: I0301 09:13:08.302222 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" podUID="020a8218-62f4-4abf-a8d2-fed602de5f7f" 
containerName="oauth-openshift" containerID="cri-o://63337be8a65e2fc4ccac6c7a4ef78cbcdcc5f66cc06d8b699c08445a9271a940" gracePeriod=15 Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.812830 4792 generic.go:334] "Generic (PLEG): container finished" podID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerID="f48560c9b3d2ef7a82a59fb15bc8fde5c80832322a811c783edfd4310dd071e6" exitCode=0 Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.813220 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwt8t" event={"ID":"fee8fc8f-8d72-4606-b115-4197f599cfcb","Type":"ContainerDied","Data":"f48560c9b3d2ef7a82a59fb15bc8fde5c80832322a811c783edfd4310dd071e6"} Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.815427 4792 generic.go:334] "Generic (PLEG): container finished" podID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerID="eb013fd09760d3456bbf7d33bd14056334305ee37d0672c8be7735153ad43a65" exitCode=0 Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.815483 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7ngd" event={"ID":"8333a325-229b-4dfd-a1f8-966f39bf55fc","Type":"ContainerDied","Data":"eb013fd09760d3456bbf7d33bd14056334305ee37d0672c8be7735153ad43a65"} Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.817009 4792 generic.go:334] "Generic (PLEG): container finished" podID="020a8218-62f4-4abf-a8d2-fed602de5f7f" containerID="63337be8a65e2fc4ccac6c7a4ef78cbcdcc5f66cc06d8b699c08445a9271a940" exitCode=0 Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.817038 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" event={"ID":"020a8218-62f4-4abf-a8d2-fed602de5f7f","Type":"ContainerDied","Data":"63337be8a65e2fc4ccac6c7a4ef78cbcdcc5f66cc06d8b699c08445a9271a940"} Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.889357 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.973361 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-ocp-branding-template\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974500 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-provider-selection\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974560 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-cliconfig\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974578 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55wpx\" (UniqueName: \"kubernetes.io/projected/020a8218-62f4-4abf-a8d2-fed602de5f7f-kube-api-access-55wpx\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974599 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-dir\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 
01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974633 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-idp-0-file-data\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974661 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-trusted-ca-bundle\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974680 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-router-certs\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976171 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-policies\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976228 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-service-ca\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976244 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-login\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974833 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976265 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-serving-cert\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.975308 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976215 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976285 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-error\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976441 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-session\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976754 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.977137 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.977334 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.977438 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.977462 4792 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.977482 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.977500 4792 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.980379 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.981491 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.982205 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.983885 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020a8218-62f4-4abf-a8d2-fed602de5f7f-kube-api-access-55wpx" (OuterVolumeSpecName: "kube-api-access-55wpx") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "kube-api-access-55wpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.985495 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.986162 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.987278 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.987457 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.987656 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078409 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078445 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078458 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078471 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078486 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078500 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078514 4792 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-55wpx\" (UniqueName: \"kubernetes.io/projected/020a8218-62f4-4abf-a8d2-fed602de5f7f-kube-api-access-55wpx\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078527 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078538 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.254801 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774f47c464-sjvj2"] Mar 01 09:13:10 crc kubenswrapper[4792]: W0301 09:13:10.283999 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb03c466_db7d_4cc2_9744_fb90116eac6f.slice/crio-2782208b0d666f0e62b1d1210bb9b0f8f9d295fb7d4cff04f1224273c092f525 WatchSource:0}: Error finding container 2782208b0d666f0e62b1d1210bb9b0f8f9d295fb7d4cff04f1224273c092f525: Status 404 returned error can't find the container with id 2782208b0d666f0e62b1d1210bb9b0f8f9d295fb7d4cff04f1224273c092f525 Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.315119 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.352585 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.381590 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-utilities\") pod \"fee8fc8f-8d72-4606-b115-4197f599cfcb\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.381634 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjlvw\" (UniqueName: \"kubernetes.io/projected/fee8fc8f-8d72-4606-b115-4197f599cfcb-kube-api-access-mjlvw\") pod \"fee8fc8f-8d72-4606-b115-4197f599cfcb\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.381702 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-catalog-content\") pod \"fee8fc8f-8d72-4606-b115-4197f599cfcb\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.382860 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-utilities" (OuterVolumeSpecName: "utilities") pod "fee8fc8f-8d72-4606-b115-4197f599cfcb" (UID: "fee8fc8f-8d72-4606-b115-4197f599cfcb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.387278 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee8fc8f-8d72-4606-b115-4197f599cfcb-kube-api-access-mjlvw" (OuterVolumeSpecName: "kube-api-access-mjlvw") pod "fee8fc8f-8d72-4606-b115-4197f599cfcb" (UID: "fee8fc8f-8d72-4606-b115-4197f599cfcb"). InnerVolumeSpecName "kube-api-access-mjlvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.408636 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fee8fc8f-8d72-4606-b115-4197f599cfcb" (UID: "fee8fc8f-8d72-4606-b115-4197f599cfcb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.467620 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ftk4v"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.467890 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ftk4v" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="registry-server" containerID="cri-o://66fe92889b2819aa449c7b25e42dabb8e422bdcf054e9c103637bfad90b589e3" gracePeriod=2 Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.481856 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8"] Mar 01 09:13:10 crc kubenswrapper[4792]: E0301 09:13:10.482114 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerName="extract-content" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482134 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerName="extract-content" Mar 01 09:13:10 crc kubenswrapper[4792]: E0301 09:13:10.482147 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerName="registry-server" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482155 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" 
containerName="registry-server" Mar 01 09:13:10 crc kubenswrapper[4792]: E0301 09:13:10.482166 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerName="extract-utilities" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482178 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerName="extract-utilities" Mar 01 09:13:10 crc kubenswrapper[4792]: E0301 09:13:10.482193 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerName="extract-utilities" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482203 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerName="extract-utilities" Mar 01 09:13:10 crc kubenswrapper[4792]: E0301 09:13:10.482222 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerName="extract-content" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482231 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerName="extract-content" Mar 01 09:13:10 crc kubenswrapper[4792]: E0301 09:13:10.482243 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020a8218-62f4-4abf-a8d2-fed602de5f7f" containerName="oauth-openshift" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482251 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="020a8218-62f4-4abf-a8d2-fed602de5f7f" containerName="oauth-openshift" Mar 01 09:13:10 crc kubenswrapper[4792]: E0301 09:13:10.482259 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerName="registry-server" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482266 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" 
containerName="registry-server" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482364 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerName="registry-server" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482379 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="020a8218-62f4-4abf-a8d2-fed602de5f7f" containerName="oauth-openshift" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482392 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerName="registry-server" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.483010 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-catalog-content\") pod \"8333a325-229b-4dfd-a1f8-966f39bf55fc\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.483062 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn884\" (UniqueName: \"kubernetes.io/projected/8333a325-229b-4dfd-a1f8-966f39bf55fc-kube-api-access-jn884\") pod \"8333a325-229b-4dfd-a1f8-966f39bf55fc\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.483092 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-utilities\") pod \"8333a325-229b-4dfd-a1f8-966f39bf55fc\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.483413 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc 
kubenswrapper[4792]: I0301 09:13:10.483435 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjlvw\" (UniqueName: \"kubernetes.io/projected/fee8fc8f-8d72-4606-b115-4197f599cfcb-kube-api-access-mjlvw\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.483449 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.484234 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-utilities" (OuterVolumeSpecName: "utilities") pod "8333a325-229b-4dfd-a1f8-966f39bf55fc" (UID: "8333a325-229b-4dfd-a1f8-966f39bf55fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.484432 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.490597 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.490950 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8333a325-229b-4dfd-a1f8-966f39bf55fc-kube-api-access-jn884" (OuterVolumeSpecName: "kube-api-access-jn884") pod "8333a325-229b-4dfd-a1f8-966f39bf55fc" (UID: "8333a325-229b-4dfd-a1f8-966f39bf55fc"). InnerVolumeSpecName "kube-api-access-jn884". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.551738 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8333a325-229b-4dfd-a1f8-966f39bf55fc" (UID: "8333a325-229b-4dfd-a1f8-966f39bf55fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.584804 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.584860 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.584886 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-router-certs\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.584966 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-session\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.584990 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-login\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585017 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/584dbcf3-9289-47c3-a556-0418b670cb21-audit-dir\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585193 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585271 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-error\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585293 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmvs6\" (UniqueName: \"kubernetes.io/projected/584dbcf3-9289-47c3-a556-0418b670cb21-kube-api-access-cmvs6\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585322 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-service-ca\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585408 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585445 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: 
\"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585508 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585553 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-audit-policies\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585644 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn884\" (UniqueName: \"kubernetes.io/projected/8333a325-229b-4dfd-a1f8-966f39bf55fc-kube-api-access-jn884\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585662 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585674 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686174 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-router-certs\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686252 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-session\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686269 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-login\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686290 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/584dbcf3-9289-47c3-a556-0418b670cb21-audit-dir\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 
crc kubenswrapper[4792]: I0301 09:13:10.686311 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686334 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-error\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmvs6\" (UniqueName: \"kubernetes.io/projected/584dbcf3-9289-47c3-a556-0418b670cb21-kube-api-access-cmvs6\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686368 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-service-ca\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686386 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686402 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686426 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686445 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-audit-policies\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686466 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " 
pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.688252 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-audit-policies\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.688288 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/584dbcf3-9289-47c3-a556-0418b670cb21-audit-dir\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.689010 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.688402 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-service-ca\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.689757 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-session\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.689842 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.690922 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-router-certs\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.691243 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.691838 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " 
pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.692151 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.692198 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.692504 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-error\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.693998 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-login\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.711022 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmvs6\" (UniqueName: 
\"kubernetes.io/projected/584dbcf3-9289-47c3-a556-0418b670cb21-kube-api-access-cmvs6\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.809705 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.825082 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwt8t" event={"ID":"fee8fc8f-8d72-4606-b115-4197f599cfcb","Type":"ContainerDied","Data":"6a4a6d1ca04b5a791e8cc232ac1bcb86844593ceb569a67c479055eb35e8caec"} Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.825131 4792 scope.go:117] "RemoveContainer" containerID="f48560c9b3d2ef7a82a59fb15bc8fde5c80832322a811c783edfd4310dd071e6" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.825149 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.830160 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7ngd" event={"ID":"8333a325-229b-4dfd-a1f8-966f39bf55fc","Type":"ContainerDied","Data":"90fafa3ed5c527a4e521d0e12598d211c5be0e011b432479d9a243fe086b9d81"} Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.830219 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.832789 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" event={"ID":"020a8218-62f4-4abf-a8d2-fed602de5f7f","Type":"ContainerDied","Data":"87d106e344fa51bb8e5e92cf97b7c6070e8daa571dd37784f078b0bfdb5ba165"} Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.832882 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.837134 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" event={"ID":"cb03c466-db7d-4cc2-9744-fb90116eac6f","Type":"ContainerStarted","Data":"2782208b0d666f0e62b1d1210bb9b0f8f9d295fb7d4cff04f1224273c092f525"} Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.909153 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwt8t"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.952529 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwt8t"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.964826 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l7ngd"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.973189 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l7ngd"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.985670 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prqqp"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.988777 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-prqqp"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.992520 4792 scope.go:117] "RemoveContainer" containerID="8da1a6a75bb09923b49fb00136c53f3ad6da4b84f38958d2ae68061cac2e183c" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.024289 4792 scope.go:117] "RemoveContainer" containerID="e49f509c95cf50378f216231453383d3c1c559a790286ac1d1a0c0a75d1546f4" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.045760 4792 scope.go:117] "RemoveContainer" containerID="eb013fd09760d3456bbf7d33bd14056334305ee37d0672c8be7735153ad43a65" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.059549 4792 scope.go:117] "RemoveContainer" containerID="fe153c70dda9f9d22a7d85fdfcdcada881b1df1b9449eeb2a850a3a6208653c3" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.083021 4792 scope.go:117] "RemoveContainer" containerID="ea0440bd6858d820e5ab5d60d7085504804067b9ef42039aabaf07d6f0cd7730" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.092030 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8"] Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.098305 4792 scope.go:117] "RemoveContainer" containerID="63337be8a65e2fc4ccac6c7a4ef78cbcdcc5f66cc06d8b699c08445a9271a940" Mar 01 09:13:11 crc kubenswrapper[4792]: W0301 09:13:11.101181 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod584dbcf3_9289_47c3_a556_0418b670cb21.slice/crio-ddee0dfea9750e94ae58bfc7dc992b0bfdb5fd94b13629a83b3b0f431dc26676 WatchSource:0}: Error finding container ddee0dfea9750e94ae58bfc7dc992b0bfdb5fd94b13629a83b3b0f431dc26676: Status 404 returned error can't find the container with id ddee0dfea9750e94ae58bfc7dc992b0bfdb5fd94b13629a83b3b0f431dc26676 Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.416082 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="020a8218-62f4-4abf-a8d2-fed602de5f7f" path="/var/lib/kubelet/pods/020a8218-62f4-4abf-a8d2-fed602de5f7f/volumes" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.416874 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" path="/var/lib/kubelet/pods/8333a325-229b-4dfd-a1f8-966f39bf55fc/volumes" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.417425 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" path="/var/lib/kubelet/pods/fee8fc8f-8d72-4606-b115-4197f599cfcb/volumes" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.843700 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" event={"ID":"cb03c466-db7d-4cc2-9744-fb90116eac6f","Type":"ContainerStarted","Data":"fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f"} Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.845386 4792 generic.go:334] "Generic (PLEG): container finished" podID="e93e87c0-86c5-446d-9f43-71d17960a351" containerID="66fe92889b2819aa449c7b25e42dabb8e422bdcf054e9c103637bfad90b589e3" exitCode=0 Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.845439 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftk4v" event={"ID":"e93e87c0-86c5-446d-9f43-71d17960a351","Type":"ContainerDied","Data":"66fe92889b2819aa449c7b25e42dabb8e422bdcf054e9c103637bfad90b589e3"} Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.846506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" event={"ID":"f32059d6-24aa-4b5a-afb2-b2cf75704d53","Type":"ContainerStarted","Data":"f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050"} Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.848078 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-7nwc2" event={"ID":"22c7c368-3523-4224-aebd-59b29640bed0","Type":"ContainerStarted","Data":"225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1"} Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.849646 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6b9w" event={"ID":"6fd91972-6bfc-4041-abc2-8f4298584603","Type":"ContainerStarted","Data":"9b1ebd40c76d6a05f41f90e7e6ff9f66cc3e0ca75b2cbd8232430f1380f165ea"} Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.853103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"d953fb6bdd1dde8b39f7c850e5987057bc7f87ba2081744a1e12425c6ecc8289"} Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.855416 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cw675" event={"ID":"dff0d675-52dd-4cac-a7be-8750333c28e3","Type":"ContainerStarted","Data":"5bb5a5f0949169743626acf007f4aa939920851854703451752f35c1714d5b63"} Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.856347 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" event={"ID":"584dbcf3-9289-47c3-a556-0418b670cb21","Type":"ContainerStarted","Data":"ddee0dfea9750e94ae58bfc7dc992b0bfdb5fd94b13629a83b3b0f431dc26676"} Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.863267 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" event={"ID":"584dbcf3-9289-47c3-a556-0418b670cb21","Type":"ContainerStarted","Data":"af637ff35f9f2e8b44705b6017f39b85dcd173a564912f90d5eeac99c1dbbae3"} Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.865681 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="22c7c368-3523-4224-aebd-59b29640bed0" containerID="225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1" exitCode=0 Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.865826 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7nwc2" event={"ID":"22c7c368-3523-4224-aebd-59b29640bed0","Type":"ContainerDied","Data":"225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1"} Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.866330 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.872031 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.926644 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" podStartSLOduration=14.926627477 podStartE2EDuration="14.926627477s" podCreationTimestamp="2026-03-01 09:12:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:13:12.922538875 +0000 UTC m=+322.164418082" watchObservedRunningTime="2026-03-01 09:13:12.926627477 +0000 UTC m=+322.168506674" Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.952593 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p6b9w" podStartSLOduration=4.616126009 podStartE2EDuration="1m29.952579725s" podCreationTimestamp="2026-03-01 09:11:43 +0000 UTC" firstStartedPulling="2026-03-01 09:11:44.804786092 +0000 UTC m=+234.046665289" lastFinishedPulling="2026-03-01 09:13:10.141239798 +0000 UTC m=+319.383119005" observedRunningTime="2026-03-01 09:13:12.948950805 +0000 
UTC m=+322.190830002" watchObservedRunningTime="2026-03-01 09:13:12.952579725 +0000 UTC m=+322.194458922" Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.973959 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cw675" podStartSLOduration=5.503586673 podStartE2EDuration="1m30.973937679s" podCreationTimestamp="2026-03-01 09:11:42 +0000 UTC" firstStartedPulling="2026-03-01 09:11:44.787979238 +0000 UTC m=+234.029858435" lastFinishedPulling="2026-03-01 09:13:10.258330244 +0000 UTC m=+319.500209441" observedRunningTime="2026-03-01 09:13:12.971323274 +0000 UTC m=+322.213202471" watchObservedRunningTime="2026-03-01 09:13:12.973937679 +0000 UTC m=+322.215816866" Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.989495 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" podStartSLOduration=15.989480677 podStartE2EDuration="15.989480677s" podCreationTimestamp="2026-03-01 09:12:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:13:12.988994325 +0000 UTC m=+322.230873522" watchObservedRunningTime="2026-03-01 09:13:12.989480677 +0000 UTC m=+322.231359874" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.229045 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.229225 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.269494 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.316442 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tbnw\" (UniqueName: \"kubernetes.io/projected/e93e87c0-86c5-446d-9f43-71d17960a351-kube-api-access-4tbnw\") pod \"e93e87c0-86c5-446d-9f43-71d17960a351\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.316510 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-utilities\") pod \"e93e87c0-86c5-446d-9f43-71d17960a351\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.316547 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-catalog-content\") pod \"e93e87c0-86c5-446d-9f43-71d17960a351\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.320732 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-utilities" (OuterVolumeSpecName: "utilities") pod "e93e87c0-86c5-446d-9f43-71d17960a351" (UID: "e93e87c0-86c5-446d-9f43-71d17960a351"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.329079 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93e87c0-86c5-446d-9f43-71d17960a351-kube-api-access-4tbnw" (OuterVolumeSpecName: "kube-api-access-4tbnw") pod "e93e87c0-86c5-446d-9f43-71d17960a351" (UID: "e93e87c0-86c5-446d-9f43-71d17960a351"). InnerVolumeSpecName "kube-api-access-4tbnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.361851 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.362084 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.417668 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tbnw\" (UniqueName: \"kubernetes.io/projected/e93e87c0-86c5-446d-9f43-71d17960a351-kube-api-access-4tbnw\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.417711 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.873519 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftk4v" event={"ID":"e93e87c0-86c5-446d-9f43-71d17960a351","Type":"ContainerDied","Data":"4d930733fc56c620ebbeeb1e0668d704b5df033f141d5439fd47f1e4757bacb0"} Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.873785 4792 scope.go:117] "RemoveContainer" containerID="66fe92889b2819aa449c7b25e42dabb8e422bdcf054e9c103637bfad90b589e3" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.874552 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.888201 4792 scope.go:117] "RemoveContainer" containerID="4ae585f0378f09e8557ad2a19a0474c0c5f7679c6fe6f603a65e86c389831b72" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.897636 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" podStartSLOduration=30.897613349 podStartE2EDuration="30.897613349s" podCreationTimestamp="2026-03-01 09:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:13:13.892386209 +0000 UTC m=+323.134265406" watchObservedRunningTime="2026-03-01 09:13:13.897613349 +0000 UTC m=+323.139492546" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.904453 4792 scope.go:117] "RemoveContainer" containerID="c36d8345eeaa99ca3b40ff37727a6e09aab9d36c26bc0292d9c0f3405bd1fe17" Mar 01 09:13:14 crc kubenswrapper[4792]: I0301 09:13:14.038407 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e93e87c0-86c5-446d-9f43-71d17960a351" (UID: "e93e87c0-86c5-446d-9f43-71d17960a351"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:13:14 crc kubenswrapper[4792]: I0301 09:13:14.126015 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:14 crc kubenswrapper[4792]: I0301 09:13:14.206021 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ftk4v"] Mar 01 09:13:14 crc kubenswrapper[4792]: I0301 09:13:14.208866 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ftk4v"] Mar 01 09:13:14 crc kubenswrapper[4792]: I0301 09:13:14.283585 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-cw675" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="registry-server" probeResult="failure" output=< Mar 01 09:13:14 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:13:14 crc kubenswrapper[4792]: > Mar 01 09:13:14 crc kubenswrapper[4792]: I0301 09:13:14.398636 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-p6b9w" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="registry-server" probeResult="failure" output=< Mar 01 09:13:14 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:13:14 crc kubenswrapper[4792]: > Mar 01 09:13:14 crc kubenswrapper[4792]: I0301 09:13:14.886733 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7nwc2" event={"ID":"22c7c368-3523-4224-aebd-59b29640bed0","Type":"ContainerStarted","Data":"6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec"} Mar 01 09:13:15 crc kubenswrapper[4792]: I0301 09:13:15.420317 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e93e87c0-86c5-446d-9f43-71d17960a351" path="/var/lib/kubelet/pods/e93e87c0-86c5-446d-9f43-71d17960a351/volumes" Mar 01 09:13:16 crc kubenswrapper[4792]: I0301 09:13:16.188745 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:13:16 crc kubenswrapper[4792]: I0301 09:13:16.188790 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.231323 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7nwc2" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="registry-server" probeResult="failure" output=< Mar 01 09:13:17 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:13:17 crc kubenswrapper[4792]: > Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.865476 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7nwc2" podStartSLOduration=6.796994658 podStartE2EDuration="1m32.86543761s" podCreationTimestamp="2026-03-01 09:11:45 +0000 UTC" firstStartedPulling="2026-03-01 09:11:48.04378161 +0000 UTC m=+237.285660807" lastFinishedPulling="2026-03-01 09:13:14.112224562 +0000 UTC m=+323.354103759" observedRunningTime="2026-03-01 09:13:14.915538944 +0000 UTC m=+324.157418141" watchObservedRunningTime="2026-03-01 09:13:17.86543761 +0000 UTC m=+327.107316807" Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.870194 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774f47c464-sjvj2"] Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.870427 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" podUID="cb03c466-db7d-4cc2-9744-fb90116eac6f" 
containerName="controller-manager" containerID="cri-o://fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f" gracePeriod=30 Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.933898 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk"] Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.934095 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" podUID="f32059d6-24aa-4b5a-afb2-b2cf75704d53" containerName="route-controller-manager" containerID="cri-o://f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050" gracePeriod=30 Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.934836 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.940648 4792 patch_prober.go:28] interesting pod/route-controller-manager-79b74d7999-nwdjk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": read tcp 10.217.0.2:38504->10.217.0.63:8443: read: connection reset by peer" start-of-body= Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.940696 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" podUID="f32059d6-24aa-4b5a-afb2-b2cf75704d53" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": read tcp 10.217.0.2:38504->10.217.0.63:8443: read: connection reset by peer" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.515817 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.521373 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.601713 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-proxy-ca-bundles\") pod \"cb03c466-db7d-4cc2-9744-fb90116eac6f\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.601784 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chxj8\" (UniqueName: \"kubernetes.io/projected/f32059d6-24aa-4b5a-afb2-b2cf75704d53-kube-api-access-chxj8\") pod \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.601814 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-client-ca\") pod \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.601860 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk7hf\" (UniqueName: \"kubernetes.io/projected/cb03c466-db7d-4cc2-9744-fb90116eac6f-kube-api-access-wk7hf\") pod \"cb03c466-db7d-4cc2-9744-fb90116eac6f\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.601951 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-config\") pod \"cb03c466-db7d-4cc2-9744-fb90116eac6f\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.601975 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb03c466-db7d-4cc2-9744-fb90116eac6f-serving-cert\") pod \"cb03c466-db7d-4cc2-9744-fb90116eac6f\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.601995 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f32059d6-24aa-4b5a-afb2-b2cf75704d53-serving-cert\") pod \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.602042 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-config\") pod \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.602063 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-client-ca\") pod \"cb03c466-db7d-4cc2-9744-fb90116eac6f\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.602644 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cb03c466-db7d-4cc2-9744-fb90116eac6f" (UID: "cb03c466-db7d-4cc2-9744-fb90116eac6f"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.602861 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-client-ca" (OuterVolumeSpecName: "client-ca") pod "cb03c466-db7d-4cc2-9744-fb90116eac6f" (UID: "cb03c466-db7d-4cc2-9744-fb90116eac6f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.603545 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-client-ca" (OuterVolumeSpecName: "client-ca") pod "f32059d6-24aa-4b5a-afb2-b2cf75704d53" (UID: "f32059d6-24aa-4b5a-afb2-b2cf75704d53"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.603562 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-config" (OuterVolumeSpecName: "config") pod "cb03c466-db7d-4cc2-9744-fb90116eac6f" (UID: "cb03c466-db7d-4cc2-9744-fb90116eac6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.604185 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-config" (OuterVolumeSpecName: "config") pod "f32059d6-24aa-4b5a-afb2-b2cf75704d53" (UID: "f32059d6-24aa-4b5a-afb2-b2cf75704d53"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.607366 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb03c466-db7d-4cc2-9744-fb90116eac6f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cb03c466-db7d-4cc2-9744-fb90116eac6f" (UID: "cb03c466-db7d-4cc2-9744-fb90116eac6f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.607492 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f32059d6-24aa-4b5a-afb2-b2cf75704d53-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f32059d6-24aa-4b5a-afb2-b2cf75704d53" (UID: "f32059d6-24aa-4b5a-afb2-b2cf75704d53"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.607609 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f32059d6-24aa-4b5a-afb2-b2cf75704d53-kube-api-access-chxj8" (OuterVolumeSpecName: "kube-api-access-chxj8") pod "f32059d6-24aa-4b5a-afb2-b2cf75704d53" (UID: "f32059d6-24aa-4b5a-afb2-b2cf75704d53"). InnerVolumeSpecName "kube-api-access-chxj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.608001 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb03c466-db7d-4cc2-9744-fb90116eac6f-kube-api-access-wk7hf" (OuterVolumeSpecName: "kube-api-access-wk7hf") pod "cb03c466-db7d-4cc2-9744-fb90116eac6f" (UID: "cb03c466-db7d-4cc2-9744-fb90116eac6f"). InnerVolumeSpecName "kube-api-access-wk7hf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.703955 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704015 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chxj8\" (UniqueName: \"kubernetes.io/projected/f32059d6-24aa-4b5a-afb2-b2cf75704d53-kube-api-access-chxj8\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704040 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704058 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk7hf\" (UniqueName: \"kubernetes.io/projected/cb03c466-db7d-4cc2-9744-fb90116eac6f-kube-api-access-wk7hf\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704078 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704095 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb03c466-db7d-4cc2-9744-fb90116eac6f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704112 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f32059d6-24aa-4b5a-afb2-b2cf75704d53-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704129 4792 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704145 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.912614 4792 generic.go:334] "Generic (PLEG): container finished" podID="cb03c466-db7d-4cc2-9744-fb90116eac6f" containerID="fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f" exitCode=0 Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.912704 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" event={"ID":"cb03c466-db7d-4cc2-9744-fb90116eac6f","Type":"ContainerDied","Data":"fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f"} Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.912771 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.913830 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" event={"ID":"cb03c466-db7d-4cc2-9744-fb90116eac6f","Type":"ContainerDied","Data":"2782208b0d666f0e62b1d1210bb9b0f8f9d295fb7d4cff04f1224273c092f525"} Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.913875 4792 scope.go:117] "RemoveContainer" containerID="fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.915514 4792 generic.go:334] "Generic (PLEG): container finished" podID="f32059d6-24aa-4b5a-afb2-b2cf75704d53" containerID="f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050" exitCode=0 Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.915553 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" event={"ID":"f32059d6-24aa-4b5a-afb2-b2cf75704d53","Type":"ContainerDied","Data":"f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050"} Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.915579 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" event={"ID":"f32059d6-24aa-4b5a-afb2-b2cf75704d53","Type":"ContainerDied","Data":"8e9b5bce7a19f5245dcb156079f150381ad068104bb67fabd9a693966fe2fe05"} Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.915584 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.948415 4792 scope.go:117] "RemoveContainer" containerID="fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f" Mar 01 09:13:18 crc kubenswrapper[4792]: E0301 09:13:18.949532 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f\": container with ID starting with fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f not found: ID does not exist" containerID="fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.949582 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f"} err="failed to get container status \"fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f\": rpc error: code = NotFound desc = could not find container \"fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f\": container with ID starting with fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f not found: ID does not exist" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.949611 4792 scope.go:117] "RemoveContainer" containerID="f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.971033 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774f47c464-sjvj2"] Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.973227 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-774f47c464-sjvj2"] Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.974551 4792 scope.go:117] 
"RemoveContainer" containerID="f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050" Mar 01 09:13:18 crc kubenswrapper[4792]: E0301 09:13:18.975125 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050\": container with ID starting with f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050 not found: ID does not exist" containerID="f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.975203 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050"} err="failed to get container status \"f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050\": rpc error: code = NotFound desc = could not find container \"f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050\": container with ID starting with f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050 not found: ID does not exist" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.993074 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk"] Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.002150 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk"] Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.161226 4792 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.161468 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="registry-server" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 
09:13:19.161487 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="registry-server" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.161504 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="extract-content" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.161510 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="extract-content" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.161518 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f32059d6-24aa-4b5a-afb2-b2cf75704d53" containerName="route-controller-manager" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.161524 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f32059d6-24aa-4b5a-afb2-b2cf75704d53" containerName="route-controller-manager" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.161531 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="extract-utilities" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.161538 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="extract-utilities" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.161546 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb03c466-db7d-4cc2-9744-fb90116eac6f" containerName="controller-manager" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.161552 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb03c466-db7d-4cc2-9744-fb90116eac6f" containerName="controller-manager" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.161652 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb03c466-db7d-4cc2-9744-fb90116eac6f" containerName="controller-manager" Mar 01 09:13:19 crc 
kubenswrapper[4792]: I0301 09:13:19.161663 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f32059d6-24aa-4b5a-afb2-b2cf75704d53" containerName="route-controller-manager" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.161675 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="registry-server" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.162092 4792 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.162416 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610" gracePeriod=15 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.162611 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.162951 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40" gracePeriod=15 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.162996 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48" gracePeriod=15 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163028 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1" gracePeriod=15 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163061 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7" gracePeriod=15 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163489 4792 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163675 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 
01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163689 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163704 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163710 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163718 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163724 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163731 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163737 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163746 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163752 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163761 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163767 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163774 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163780 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163788 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163794 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163801 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163806 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163817 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163836 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163928 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163938 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163945 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163952 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163960 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163968 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163975 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.164066 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.164072 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.164154 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.164162 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.164169 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.228562 4792 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]log ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]api-openshift-apiserver-available ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]api-openshift-oauth-apiserver-available ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]informer-sync ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/generic-apiserver-start-informers ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/priority-and-fairness-filter ok Mar 01 09:13:19 crc 
kubenswrapper[4792]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-apiextensions-informers ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-apiextensions-controllers ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/crd-informer-synced ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-system-namespaces-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/rbac/bootstrap-roles ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/bootstrap-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-kube-aggregator-informers ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/apiservice-registration-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 01 09:13:19 crc kubenswrapper[4792]: 
[+]poststarthook/apiservice-discovery-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]autoregister-completion ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/apiservice-openapi-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [-]shutdown failed: reason withheld Mar 01 09:13:19 crc kubenswrapper[4792]: readyz check failed Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.228627 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.238105 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311275 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311320 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311358 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311421 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311490 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311539 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311566 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311586 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412241 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412292 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412320 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412356 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412370 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412403 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412391 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412429 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412376 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412439 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412479 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412517 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412541 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412547 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412555 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412614 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.416017 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb03c466-db7d-4cc2-9744-fb90116eac6f" path="/var/lib/kubelet/pods/cb03c466-db7d-4cc2-9744-fb90116eac6f/volumes" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.416525 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f32059d6-24aa-4b5a-afb2-b2cf75704d53" path="/var/lib/kubelet/pods/f32059d6-24aa-4b5a-afb2-b2cf75704d53/volumes" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.535264 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: W0301 09:13:19.565403 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-be28ae32e2e32e88a62b899a8c7d88cd88d16451f547cb14206f079e225e827c WatchSource:0}: Error finding container be28ae32e2e32e88a62b899a8c7d88cd88d16451f547cb14206f079e225e827c: Status 404 returned error can't find the container with id be28ae32e2e32e88a62b899a8c7d88cd88d16451f547cb14206f079e225e827c Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.569677 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1898acc14bb41be2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:13:19.569050594 +0000 UTC m=+328.810929791,LastTimestamp:2026-03-01 09:13:19.569050594 +0000 UTC m=+328.810929791,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.923266 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.924578 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.925334 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40" exitCode=0 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.925358 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48" exitCode=0 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.925369 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1" exitCode=0 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.925379 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7" exitCode=2 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.925462 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.928326 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae"} Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.928376 4792 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"be28ae32e2e32e88a62b899a8c7d88cd88d16451f547cb14206f079e225e827c"} Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.929009 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.929589 4792 generic.go:334] "Generic (PLEG): container finished" podID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" containerID="7ee832f8488525fdd9e2872f5f8c217b74292b7dc7a5f5c0537fec99a3845c5f" exitCode=0 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.929627 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4bc9fd32-61ce-4fbb-b67e-1376102f5384","Type":"ContainerDied","Data":"7ee832f8488525fdd9e2872f5f8c217b74292b7dc7a5f5c0537fec99a3845c5f"} Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.930313 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.930930 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: 
connection refused" Mar 01 09:13:20 crc kubenswrapper[4792]: I0301 09:13:20.810713 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:20 crc kubenswrapper[4792]: I0301 09:13:20.816581 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:20 crc kubenswrapper[4792]: I0301 09:13:20.817266 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:20 crc kubenswrapper[4792]: I0301 09:13:20.817975 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:20 crc kubenswrapper[4792]: I0301 09:13:20.818439 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:20 crc kubenswrapper[4792]: I0301 09:13:20.942156 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.289853 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.290565 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.290744 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.290969 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.411555 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.412199 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.412555 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.446864 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kube-api-access\") pod \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.446951 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kubelet-dir\") pod \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.446980 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-var-lock\") pod \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.447198 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-var-lock" (OuterVolumeSpecName: "var-lock") pod "4bc9fd32-61ce-4fbb-b67e-1376102f5384" (UID: "4bc9fd32-61ce-4fbb-b67e-1376102f5384"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.447315 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4bc9fd32-61ce-4fbb-b67e-1376102f5384" (UID: "4bc9fd32-61ce-4fbb-b67e-1376102f5384"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.451723 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4bc9fd32-61ce-4fbb-b67e-1376102f5384" (UID: "4bc9fd32-61ce-4fbb-b67e-1376102f5384"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.548631 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.548677 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.548685 4792 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-var-lock\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.555404 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 01 09:13:21 crc 
kubenswrapper[4792]: I0301 09:13:21.556233 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.557210 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.557811 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.558611 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.559204 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.651054 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.651938 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.652147 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.651144 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.652407 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.652535 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.653248 4792 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.653428 4792 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.653616 4792 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.953432 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.954553 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610" exitCode=0 Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.954745 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.955061 4792 scope.go:117] "RemoveContainer" containerID="4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.958506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4bc9fd32-61ce-4fbb-b67e-1376102f5384","Type":"ContainerDied","Data":"e1aa5214c0457299056866d70c565c5094c4dfa52c85c6428fa756810ed8da10"} Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.958553 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1aa5214c0457299056866d70c565c5094c4dfa52c85c6428fa756810ed8da10" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.958653 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.986085 4792 scope.go:117] "RemoveContainer" containerID="d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.991900 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.992340 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.992618 4792 
status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.992868 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.993635 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.994410 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.994638 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 
09:13:21.995526 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.007438 4792 scope.go:117] "RemoveContainer" containerID="5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.024084 4792 scope.go:117] "RemoveContainer" containerID="8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.043673 4792 scope.go:117] "RemoveContainer" containerID="1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.064945 4792 scope.go:117] "RemoveContainer" containerID="d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.086263 4792 scope.go:117] "RemoveContainer" containerID="4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40" Mar 01 09:13:22 crc kubenswrapper[4792]: E0301 09:13:22.086847 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40\": container with ID starting with 4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40 not found: ID does not exist" containerID="4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.086977 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40"} err="failed to get container status 
\"4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40\": rpc error: code = NotFound desc = could not find container \"4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40\": container with ID starting with 4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40 not found: ID does not exist" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.087067 4792 scope.go:117] "RemoveContainer" containerID="d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48" Mar 01 09:13:22 crc kubenswrapper[4792]: E0301 09:13:22.087720 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\": container with ID starting with d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48 not found: ID does not exist" containerID="d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.087776 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48"} err="failed to get container status \"d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\": rpc error: code = NotFound desc = could not find container \"d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\": container with ID starting with d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48 not found: ID does not exist" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.087810 4792 scope.go:117] "RemoveContainer" containerID="5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1" Mar 01 09:13:22 crc kubenswrapper[4792]: E0301 09:13:22.088202 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\": container with ID starting with 5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1 not found: ID does not exist" containerID="5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.088290 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1"} err="failed to get container status \"5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\": rpc error: code = NotFound desc = could not find container \"5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\": container with ID starting with 5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1 not found: ID does not exist" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.088357 4792 scope.go:117] "RemoveContainer" containerID="8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7" Mar 01 09:13:22 crc kubenswrapper[4792]: E0301 09:13:22.089045 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\": container with ID starting with 8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7 not found: ID does not exist" containerID="8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.089124 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7"} err="failed to get container status \"8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\": rpc error: code = NotFound desc = could not find container \"8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\": container with ID 
starting with 8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7 not found: ID does not exist" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.089201 4792 scope.go:117] "RemoveContainer" containerID="1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610" Mar 01 09:13:22 crc kubenswrapper[4792]: E0301 09:13:22.089627 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\": container with ID starting with 1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610 not found: ID does not exist" containerID="1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.089702 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610"} err="failed to get container status \"1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\": rpc error: code = NotFound desc = could not find container \"1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\": container with ID starting with 1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610 not found: ID does not exist" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.089773 4792 scope.go:117] "RemoveContainer" containerID="d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada" Mar 01 09:13:22 crc kubenswrapper[4792]: E0301 09:13:22.090169 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\": container with ID starting with d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada not found: ID does not exist" containerID="d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada" Mar 01 
09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.090217 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada"} err="failed to get container status \"d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\": rpc error: code = NotFound desc = could not find container \"d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\": container with ID starting with d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada not found: ID does not exist" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.299111 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.300997 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.301395 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.301851 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 
01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.302611 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.303462 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.360324 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.360877 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.361383 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.361760 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.362140 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.362551 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.422349 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.438757 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.439572 4792 status_manager.go:851] "Failed to get status for pod" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" pod="openshift-marketplace/community-operators-p6b9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p6b9w\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.440138 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.440782 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.441736 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.442262 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.510594 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.511406 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: 
connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.512012 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.512544 4792 status_manager.go:851] "Failed to get status for pod" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" pod="openshift-marketplace/community-operators-p6b9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p6b9w\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.513008 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.513407 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:24 crc kubenswrapper[4792]: E0301 09:13:24.079127 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:24 crc kubenswrapper[4792]: E0301 
09:13:24.079882 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:24 crc kubenswrapper[4792]: E0301 09:13:24.080378 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:24 crc kubenswrapper[4792]: E0301 09:13:24.080878 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:24 crc kubenswrapper[4792]: E0301 09:13:24.081340 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:24 crc kubenswrapper[4792]: I0301 09:13:24.081391 4792 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 01 09:13:24 crc kubenswrapper[4792]: E0301 09:13:24.081730 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="200ms" Mar 01 09:13:24 crc kubenswrapper[4792]: E0301 09:13:24.283103 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="400ms" Mar 01 
09:13:24 crc kubenswrapper[4792]: E0301 09:13:24.684033 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="800ms" Mar 01 09:13:25 crc kubenswrapper[4792]: E0301 09:13:25.485009 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="1.6s" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.262577 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.263577 4792 status_manager.go:851] "Failed to get status for pod" podUID="22c7c368-3523-4224-aebd-59b29640bed0" pod="openshift-marketplace/redhat-operators-7nwc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7nwc2\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.264973 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.265759 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 
38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.266464 4792 status_manager.go:851] "Failed to get status for pod" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" pod="openshift-marketplace/community-operators-p6b9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p6b9w\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.266966 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.267607 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.322293 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.323221 4792 status_manager.go:851] "Failed to get status for pod" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" pod="openshift-marketplace/community-operators-p6b9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p6b9w\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.323805 4792 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.324514 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.324998 4792 status_manager.go:851] "Failed to get status for pod" podUID="22c7c368-3523-4224-aebd-59b29640bed0" pod="openshift-marketplace/redhat-operators-7nwc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7nwc2\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.325421 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.325858 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:27 crc kubenswrapper[4792]: E0301 09:13:27.086390 4792 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="3.2s" Mar 01 09:13:29 crc kubenswrapper[4792]: E0301 09:13:29.459938 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1898acc14bb41be2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:13:19.569050594 +0000 UTC m=+328.810929791,LastTimestamp:2026-03-01 09:13:19.569050594 +0000 UTC m=+328.810929791,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:13:30 crc kubenswrapper[4792]: E0301 09:13:30.287190 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="6.4s" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.408841 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.412170 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.412466 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.412655 4792 status_manager.go:851] "Failed to get status for pod" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" pod="openshift-marketplace/community-operators-p6b9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p6b9w\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.412791 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.412970 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.413295 4792 status_manager.go:851] "Failed to get status for pod" podUID="22c7c368-3523-4224-aebd-59b29640bed0" pod="openshift-marketplace/redhat-operators-7nwc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7nwc2\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.413874 4792 status_manager.go:851] "Failed to get status for pod" podUID="22c7c368-3523-4224-aebd-59b29640bed0" pod="openshift-marketplace/redhat-operators-7nwc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7nwc2\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.414147 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.414499 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.414880 4792 status_manager.go:851] "Failed to get status for pod" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" pod="openshift-marketplace/community-operators-p6b9w" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p6b9w\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.415292 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.415533 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.425075 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.425108 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:31 crc kubenswrapper[4792]: E0301 09:13:31.425536 4792 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.426276 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:31 crc kubenswrapper[4792]: W0301 09:13:31.462700 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-de26490347370479ac4fe71a4febf0c58330e3ef1ac9defc73a62c8cb4d077e7 WatchSource:0}: Error finding container de26490347370479ac4fe71a4febf0c58330e3ef1ac9defc73a62c8cb4d077e7: Status 404 returned error can't find the container with id de26490347370479ac4fe71a4febf0c58330e3ef1ac9defc73a62c8cb4d077e7 Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.030211 4792 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="2c69491c8520b0c3299bb8bff8a9128dc94446caa1325badc2162e5d39d0744d" exitCode=0 Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.030330 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"2c69491c8520b0c3299bb8bff8a9128dc94446caa1325badc2162e5d39d0744d"} Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.030697 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"de26490347370479ac4fe71a4febf0c58330e3ef1ac9defc73a62c8cb4d077e7"} Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.031275 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.031316 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.031783 4792 status_manager.go:851] 
"Failed to get status for pod" podUID="22c7c368-3523-4224-aebd-59b29640bed0" pod="openshift-marketplace/redhat-operators-7nwc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7nwc2\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:32 crc kubenswrapper[4792]: E0301 09:13:32.032195 4792 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.032420 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.032894 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.033288 4792 status_manager.go:851] "Failed to get status for pod" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" pod="openshift-marketplace/community-operators-p6b9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p6b9w\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.034064 4792 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.034542 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:33 crc kubenswrapper[4792]: I0301 09:13:33.038847 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c6a36614beeb413ffb051ac3aaa16c27304eae223b3d69894745febf80f64bc1"} Mar 01 09:13:33 crc kubenswrapper[4792]: I0301 09:13:33.039267 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6aaac00c2bd57a2c892468be6a66b4edf964fad95d25779e62f9c9fb81968827"} Mar 01 09:13:33 crc kubenswrapper[4792]: I0301 09:13:33.039277 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ed0425c88611496504d23e2b0e98365fe19146df38e3248933a2747a1cfd6ffb"} Mar 01 09:13:34 crc kubenswrapper[4792]: I0301 09:13:34.048123 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"185916b4b4557494e9003d7fe65c3f35fd6e3b5ef2c507151a5435e3f373b9a0"} Mar 01 09:13:34 crc kubenswrapper[4792]: 
I0301 09:13:34.048163 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"aacad5073f12ad0c4d39fbff51590bbf75cc36a18c5050288d92bbb137e5415d"}
Mar 01 09:13:34 crc kubenswrapper[4792]: I0301 09:13:34.048370 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:13:34 crc kubenswrapper[4792]: I0301 09:13:34.048585 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1"
Mar 01 09:13:34 crc kubenswrapper[4792]: I0301 09:13:34.048619 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1"
Mar 01 09:13:35 crc kubenswrapper[4792]: I0301 09:13:35.057183 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 01 09:13:35 crc kubenswrapper[4792]: I0301 09:13:35.059267 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 01 09:13:35 crc kubenswrapper[4792]: I0301 09:13:35.059341 4792 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c" exitCode=1
Mar 01 09:13:35 crc kubenswrapper[4792]: I0301 09:13:35.059391 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c"}
Mar 01 09:13:35 crc kubenswrapper[4792]: I0301 09:13:35.060290 4792 scope.go:117] "RemoveContainer" containerID="859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c"
Mar 01 09:13:36 crc kubenswrapper[4792]: I0301 09:13:36.066377 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 01 09:13:36 crc kubenswrapper[4792]: I0301 09:13:36.067987 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 01 09:13:36 crc kubenswrapper[4792]: I0301 09:13:36.068042 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"96c3960038c0101f16ce250d82c61191002c8a6e70c05569b64ed4476b4203ef"}
Mar 01 09:13:36 crc kubenswrapper[4792]: I0301 09:13:36.426424 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:13:36 crc kubenswrapper[4792]: I0301 09:13:36.426748 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:13:36 crc kubenswrapper[4792]: I0301 09:13:36.434375 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:13:39 crc kubenswrapper[4792]: I0301 09:13:39.064810 4792 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:13:39 crc kubenswrapper[4792]: I0301 09:13:39.082039 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1"
Mar 01 09:13:39 crc kubenswrapper[4792]: I0301 09:13:39.082238 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1"
Mar 01 09:13:39 crc kubenswrapper[4792]: I0301 09:13:39.085453 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:13:40 crc kubenswrapper[4792]: I0301 09:13:40.086929 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1"
Mar 01 09:13:40 crc kubenswrapper[4792]: I0301 09:13:40.087255 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1"
Mar 01 09:13:40 crc kubenswrapper[4792]: I0301 09:13:40.115593 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:13:40 crc kubenswrapper[4792]: I0301 09:13:40.119225 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:13:41 crc kubenswrapper[4792]: I0301 09:13:41.006954 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:13:41 crc kubenswrapper[4792]: I0301 09:13:41.429112 4792 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f1fe19d9-d509-4345-a79f-7d4bad570cf9"
Mar 01 09:13:49 crc kubenswrapper[4792]: I0301 09:13:49.664557 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 01 09:13:50 crc kubenswrapper[4792]: I0301 09:13:50.176700 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 01 09:13:50 crc kubenswrapper[4792]: I0301 09:13:50.392394 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 01 09:13:50 crc kubenswrapper[4792]: I0301 09:13:50.702944 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 01 09:13:50 crc kubenswrapper[4792]: I0301 09:13:50.827379 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 01 09:13:50 crc kubenswrapper[4792]: I0301 09:13:50.828797 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 01 09:13:50 crc kubenswrapper[4792]: I0301 09:13:50.873756 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.009979 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.102838 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.343524 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.358578 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.441543 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.560390 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.640325 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.661838 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.667210 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.684632 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.715938 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.847459 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.867779 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.003113 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.037023 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.113124 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.163625 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.331090 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.587665 4792 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.588269 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=33.588250879 podStartE2EDuration="33.588250879s" podCreationTimestamp="2026-03-01 09:13:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:13:39.142058798 +0000 UTC m=+348.383937995" watchObservedRunningTime="2026-03-01 09:13:52.588250879 +0000 UTC m=+361.830130116"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.595310 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.595374 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k","openshift-kube-apiserver/kube-apiserver-crc","openshift-controller-manager/controller-manager-55c479c949-bxrgx"]
Mar 01 09:13:52 crc kubenswrapper[4792]: E0301 09:13:52.595643 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" containerName="installer"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.595673 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" containerName="installer"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.595847 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" containerName="installer"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.596649 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.596856 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.597242 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.597288 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.601807 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.602099 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.603801 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.604110 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.604316 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.604488 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.604738 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.604883 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.605012 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.605082 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.605285 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.605399 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.605843 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8vjz\" (UniqueName: \"kubernetes.io/projected/f9ec3613-4fe1-4e71-8991-f5be9a94579e-kube-api-access-g8vjz\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.605900 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqzkw\" (UniqueName: \"kubernetes.io/projected/059440a1-ff60-496f-bdbc-8218b5ceb3f7-kube-api-access-zqzkw\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.605983 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-client-ca\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.606033 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-config\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.606107 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/059440a1-ff60-496f-bdbc-8218b5ceb3f7-serving-cert\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.606156 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ec3613-4fe1-4e71-8991-f5be9a94579e-serving-cert\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.606260 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ec3613-4fe1-4e71-8991-f5be9a94579e-client-ca\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.606300 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ec3613-4fe1-4e71-8991-f5be9a94579e-config\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.606327 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-proxy-ca-bundles\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.606939 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.613571 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.626587 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=13.626555057000001 podStartE2EDuration="13.626555057s" podCreationTimestamp="2026-03-01 09:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:13:52.624984139 +0000 UTC m=+361.866863376" watchObservedRunningTime="2026-03-01 09:13:52.626555057 +0000 UTC m=+361.868434304"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.630326 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.692293 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707370 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/059440a1-ff60-496f-bdbc-8218b5ceb3f7-serving-cert\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707417 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ec3613-4fe1-4e71-8991-f5be9a94579e-serving-cert\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707462 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ec3613-4fe1-4e71-8991-f5be9a94579e-client-ca\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707482 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-proxy-ca-bundles\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707499 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ec3613-4fe1-4e71-8991-f5be9a94579e-config\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707522 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8vjz\" (UniqueName: \"kubernetes.io/projected/f9ec3613-4fe1-4e71-8991-f5be9a94579e-kube-api-access-g8vjz\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707541 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqzkw\" (UniqueName: \"kubernetes.io/projected/059440a1-ff60-496f-bdbc-8218b5ceb3f7-kube-api-access-zqzkw\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707566 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-client-ca\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707586 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-config\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.709200 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-config\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.709408 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ec3613-4fe1-4e71-8991-f5be9a94579e-config\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.710102 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ec3613-4fe1-4e71-8991-f5be9a94579e-client-ca\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.710533 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-client-ca\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.710977 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-proxy-ca-bundles\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.714613 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ec3613-4fe1-4e71-8991-f5be9a94579e-serving-cert\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.715799 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/059440a1-ff60-496f-bdbc-8218b5ceb3f7-serving-cert\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.772660 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqzkw\" (UniqueName: \"kubernetes.io/projected/059440a1-ff60-496f-bdbc-8218b5ceb3f7-kube-api-access-zqzkw\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.772877 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8vjz\" (UniqueName: \"kubernetes.io/projected/f9ec3613-4fe1-4e71-8991-f5be9a94579e-kube-api-access-g8vjz\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.801640 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.890091 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.940327 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.957636 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.094426 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.135658 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.225525 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.353882 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.359225 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.417811 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.516433 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.537113 4792 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.551475 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.576648 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.576845 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.656126 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.666331 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.719188 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.881471 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.882892 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.953220 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.995518 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.103188 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.119176 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.143226 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.230996 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.252985 4792 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.476069 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.545610 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.573395 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.637359 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.836870 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.847590 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.873965 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.913197 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.062091 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.145213 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.158516 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.345586 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.554536 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.555592 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.707421 4792 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.961759 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.021462 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.051238 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.089418 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.105509 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.114535 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.135940 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.159482 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.172461 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.236195 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.277246 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.415958 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.417334 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.477578 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.521440 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.544465 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.577663 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.633955 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.636883 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.653309 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.664271 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.764811 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.887400 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.928210 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.955785 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.985946 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 01 09:13:57 crc
kubenswrapper[4792]: I0301 09:13:57.070843 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.079125 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.113376 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.187359 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.284747 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.309051 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.388549 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.448439 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.459408 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.554362 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.596395 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 01 09:13:57 crc 
kubenswrapper[4792]: I0301 09:13:57.602006 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.613238 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.814083 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.815882 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.968071 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.980265 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.982876 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.011489 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.029764 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.204369 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.257075 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 01 09:13:58 
crc kubenswrapper[4792]: I0301 09:13:58.307348 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.348306 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.416737 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.431484 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.517748 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.519344 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.543267 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.566507 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.600693 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.602473 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.630043 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.773001 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.817265 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.827113 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.833180 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.880920 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.060992 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.084269 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.124150 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.142840 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.148628 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.158571 4792 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.158948 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.226411 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.356198 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.377674 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.390191 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.427660 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.473673 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.481790 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.518684 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.524461 4792 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-console"/"trusted-ca-bundle" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.531250 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.582712 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.786673 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.854181 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.911125 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.948335 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.055813 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.123154 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.150175 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.157682 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539274-zckv2"] Mar 01 
09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.158270 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.160041 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.160676 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.161481 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.253774 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.303381 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.308034 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnb28\" (UniqueName: \"kubernetes.io/projected/81a3bf03-822b-4b69-93a3-b420d8f58efd-kube-api-access-gnb28\") pod \"auto-csr-approver-29539274-zckv2\" (UID: \"81a3bf03-822b-4b69-93a3-b420d8f58efd\") " pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.319999 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.378377 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.409007 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnb28\" (UniqueName: \"kubernetes.io/projected/81a3bf03-822b-4b69-93a3-b420d8f58efd-kube-api-access-gnb28\") pod \"auto-csr-approver-29539274-zckv2\" (UID: \"81a3bf03-822b-4b69-93a3-b420d8f58efd\") " pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.423815 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.424083 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.428705 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnb28\" (UniqueName: \"kubernetes.io/projected/81a3bf03-822b-4b69-93a3-b420d8f58efd-kube-api-access-gnb28\") pod \"auto-csr-approver-29539274-zckv2\" (UID: \"81a3bf03-822b-4b69-93a3-b420d8f58efd\") " pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.430147 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.475528 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.477939 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.495718 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.510744 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.585482 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.690298 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.741857 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.772038 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.801111 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.870376 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.880680 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 01 09:14:00 crc 
kubenswrapper[4792]: I0301 09:14:00.969325 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.983008 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.996557 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.026395 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.033290 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.049136 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.054079 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.278645 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.284832 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.303756 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.311804 4792 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.321569 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.368834 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.378566 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.393784 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.422358 4792 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.422750 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae" gracePeriod=5 Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.509550 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.512018 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.663317 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 01 
09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.727022 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.871407 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.888110 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.931871 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.933734 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.952634 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.959981 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.121350 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.159821 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.276806 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55c479c949-bxrgx"] Mar 01 09:14:02 crc 
kubenswrapper[4792]: I0301 09:14:02.280506 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539274-zckv2"] Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.298290 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"] Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.530395 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.550989 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.583990 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.715627 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.739648 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.752404 4792 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.824828 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.824829 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.872435 4792 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.902259 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.902851 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.005209 4792 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-f66f7bb8d-7bp5k_openshift-route-controller-manager_f9ec3613-4fe1-4e71-8991-f5be9a94579e_0(1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f): error adding pod openshift-route-controller-manager_route-controller-manager-f66f7bb8d-7bp5k to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f" Netns:"/var/run/netns/c5b3c562-e98d-43f3-9947-f2b0b86c571d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-f66f7bb8d-7bp5k;K8S_POD_INFRA_CONTAINER_ID=1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f;K8S_POD_UID=f9ec3613-4fe1-4e71-8991-f5be9a94579e" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k] networking: Multus: [openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k/f9ec3613-4fe1-4e71-8991-f5be9a94579e]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-f66f7bb8d-7bp5k in out of cluster comm: pod 
"route-controller-manager-f66f7bb8d-7bp5k" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.005295 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-f66f7bb8d-7bp5k_openshift-route-controller-manager_f9ec3613-4fe1-4e71-8991-f5be9a94579e_0(1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f): error adding pod openshift-route-controller-manager_route-controller-manager-f66f7bb8d-7bp5k to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f" Netns:"/var/run/netns/c5b3c562-e98d-43f3-9947-f2b0b86c571d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-f66f7bb8d-7bp5k;K8S_POD_INFRA_CONTAINER_ID=1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f;K8S_POD_UID=f9ec3613-4fe1-4e71-8991-f5be9a94579e" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k] networking: Multus: [openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k/f9ec3613-4fe1-4e71-8991-f5be9a94579e]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod 
route-controller-manager-f66f7bb8d-7bp5k in out of cluster comm: pod "route-controller-manager-f66f7bb8d-7bp5k" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.005334 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-f66f7bb8d-7bp5k_openshift-route-controller-manager_f9ec3613-4fe1-4e71-8991-f5be9a94579e_0(1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f): error adding pod openshift-route-controller-manager_route-controller-manager-f66f7bb8d-7bp5k to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f" Netns:"/var/run/netns/c5b3c562-e98d-43f3-9947-f2b0b86c571d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-f66f7bb8d-7bp5k;K8S_POD_INFRA_CONTAINER_ID=1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f;K8S_POD_UID=f9ec3613-4fe1-4e71-8991-f5be9a94579e" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k] networking: Multus: 
[openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k/f9ec3613-4fe1-4e71-8991-f5be9a94579e]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-f66f7bb8d-7bp5k in out of cluster comm: pod "route-controller-manager-f66f7bb8d-7bp5k" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.005413 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-f66f7bb8d-7bp5k_openshift-route-controller-manager(f9ec3613-4fe1-4e71-8991-f5be9a94579e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-f66f7bb8d-7bp5k_openshift-route-controller-manager(f9ec3613-4fe1-4e71-8991-f5be9a94579e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-f66f7bb8d-7bp5k_openshift-route-controller-manager_f9ec3613-4fe1-4e71-8991-f5be9a94579e_0(1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f): error adding pod openshift-route-controller-manager_route-controller-manager-f66f7bb8d-7bp5k to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f\\\" 
Netns:\\\"/var/run/netns/c5b3c562-e98d-43f3-9947-f2b0b86c571d\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-f66f7bb8d-7bp5k;K8S_POD_INFRA_CONTAINER_ID=1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f;K8S_POD_UID=f9ec3613-4fe1-4e71-8991-f5be9a94579e\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k] networking: Multus: [openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k/f9ec3613-4fe1-4e71-8991-f5be9a94579e]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-f66f7bb8d-7bp5k in out of cluster comm: pod \\\"route-controller-manager-f66f7bb8d-7bp5k\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" podUID="f9ec3613-4fe1-4e71-8991-f5be9a94579e" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.016167 4792 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-55c479c949-bxrgx_openshift-controller-manager_059440a1-ff60-496f-bdbc-8218b5ceb3f7_0(d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef): error adding pod openshift-controller-manager_controller-manager-55c479c949-bxrgx to CNI network 
"multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef" Netns:"/var/run/netns/b77ff266-85d5-40ff-bdf8-eae839742e42" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-55c479c949-bxrgx;K8S_POD_INFRA_CONTAINER_ID=d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef;K8S_POD_UID=059440a1-ff60-496f-bdbc-8218b5ceb3f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-55c479c949-bxrgx] networking: Multus: [openshift-controller-manager/controller-manager-55c479c949-bxrgx/059440a1-ff60-496f-bdbc-8218b5ceb3f7]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-55c479c949-bxrgx in out of cluster comm: pod "controller-manager-55c479c949-bxrgx" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.016252 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-55c479c949-bxrgx_openshift-controller-manager_059440a1-ff60-496f-bdbc-8218b5ceb3f7_0(d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef): error adding pod openshift-controller-manager_controller-manager-55c479c949-bxrgx to CNI network 
"multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef" Netns:"/var/run/netns/b77ff266-85d5-40ff-bdf8-eae839742e42" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-55c479c949-bxrgx;K8S_POD_INFRA_CONTAINER_ID=d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef;K8S_POD_UID=059440a1-ff60-496f-bdbc-8218b5ceb3f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-55c479c949-bxrgx] networking: Multus: [openshift-controller-manager/controller-manager-55c479c949-bxrgx/059440a1-ff60-496f-bdbc-8218b5ceb3f7]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-55c479c949-bxrgx in out of cluster comm: pod "controller-manager-55c479c949-bxrgx" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.016274 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-55c479c949-bxrgx_openshift-controller-manager_059440a1-ff60-496f-bdbc-8218b5ceb3f7_0(d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef): error adding pod 
openshift-controller-manager_controller-manager-55c479c949-bxrgx to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef" Netns:"/var/run/netns/b77ff266-85d5-40ff-bdf8-eae839742e42" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-55c479c949-bxrgx;K8S_POD_INFRA_CONTAINER_ID=d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef;K8S_POD_UID=059440a1-ff60-496f-bdbc-8218b5ceb3f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-55c479c949-bxrgx] networking: Multus: [openshift-controller-manager/controller-manager-55c479c949-bxrgx/059440a1-ff60-496f-bdbc-8218b5ceb3f7]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-55c479c949-bxrgx in out of cluster comm: pod "controller-manager-55c479c949-bxrgx" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.016332 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-55c479c949-bxrgx_openshift-controller-manager(059440a1-ff60-496f-bdbc-8218b5ceb3f7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"controller-manager-55c479c949-bxrgx_openshift-controller-manager(059440a1-ff60-496f-bdbc-8218b5ceb3f7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-55c479c949-bxrgx_openshift-controller-manager_059440a1-ff60-496f-bdbc-8218b5ceb3f7_0(d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef): error adding pod openshift-controller-manager_controller-manager-55c479c949-bxrgx to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef\\\" Netns:\\\"/var/run/netns/b77ff266-85d5-40ff-bdf8-eae839742e42\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-55c479c949-bxrgx;K8S_POD_INFRA_CONTAINER_ID=d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef;K8S_POD_UID=059440a1-ff60-496f-bdbc-8218b5ceb3f7\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-55c479c949-bxrgx] networking: Multus: [openshift-controller-manager/controller-manager-55c479c949-bxrgx/059440a1-ff60-496f-bdbc-8218b5ceb3f7]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-55c479c949-bxrgx in out of cluster comm: pod \\\"controller-manager-55c479c949-bxrgx\\\" not found\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" podUID="059440a1-ff60-496f-bdbc-8218b5ceb3f7" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.049148 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.094729 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.100540 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.194703 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.217200 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.217291 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.217852 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.218359 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.308161 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.325040 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.372740 4792 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29539274-zckv2_openshift-infra_81a3bf03-822b-4b69-93a3-b420d8f58efd_0(973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78): error adding pod openshift-infra_auto-csr-approver-29539274-zckv2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78" Netns:"/var/run/netns/94f9648b-6ae8-4642-82c8-580fccbe1e89" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29539274-zckv2;K8S_POD_INFRA_CONTAINER_ID=973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78;K8S_POD_UID=81a3bf03-822b-4b69-93a3-b420d8f58efd" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29539274-zckv2] networking: Multus: [openshift-infra/auto-csr-approver-29539274-zckv2/81a3bf03-822b-4b69-93a3-b420d8f58efd]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod 
auto-csr-approver-29539274-zckv2 in out of cluster comm: pod "auto-csr-approver-29539274-zckv2" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.372821 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29539274-zckv2_openshift-infra_81a3bf03-822b-4b69-93a3-b420d8f58efd_0(973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78): error adding pod openshift-infra_auto-csr-approver-29539274-zckv2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78" Netns:"/var/run/netns/94f9648b-6ae8-4642-82c8-580fccbe1e89" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29539274-zckv2;K8S_POD_INFRA_CONTAINER_ID=973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78;K8S_POD_UID=81a3bf03-822b-4b69-93a3-b420d8f58efd" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29539274-zckv2] networking: Multus: [openshift-infra/auto-csr-approver-29539274-zckv2/81a3bf03-822b-4b69-93a3-b420d8f58efd]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29539274-zckv2 in out of cluster comm: pod 
"auto-csr-approver-29539274-zckv2" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.372847 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29539274-zckv2_openshift-infra_81a3bf03-822b-4b69-93a3-b420d8f58efd_0(973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78): error adding pod openshift-infra_auto-csr-approver-29539274-zckv2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78" Netns:"/var/run/netns/94f9648b-6ae8-4642-82c8-580fccbe1e89" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29539274-zckv2;K8S_POD_INFRA_CONTAINER_ID=973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78;K8S_POD_UID=81a3bf03-822b-4b69-93a3-b420d8f58efd" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29539274-zckv2] networking: Multus: [openshift-infra/auto-csr-approver-29539274-zckv2/81a3bf03-822b-4b69-93a3-b420d8f58efd]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29539274-zckv2 in out of cluster comm: pod "auto-csr-approver-29539274-zckv2" 
not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.372932 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29539274-zckv2_openshift-infra(81a3bf03-822b-4b69-93a3-b420d8f58efd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29539274-zckv2_openshift-infra(81a3bf03-822b-4b69-93a3-b420d8f58efd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29539274-zckv2_openshift-infra_81a3bf03-822b-4b69-93a3-b420d8f58efd_0(973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78): error adding pod openshift-infra_auto-csr-approver-29539274-zckv2 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78\\\" Netns:\\\"/var/run/netns/94f9648b-6ae8-4642-82c8-580fccbe1e89\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29539274-zckv2;K8S_POD_INFRA_CONTAINER_ID=973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78;K8S_POD_UID=81a3bf03-822b-4b69-93a3-b420d8f58efd\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29539274-zckv2] networking: Multus: 
[openshift-infra/auto-csr-approver-29539274-zckv2/81a3bf03-822b-4b69-93a3-b420d8f58efd]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29539274-zckv2 in out of cluster comm: pod \\\"auto-csr-approver-29539274-zckv2\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-infra/auto-csr-approver-29539274-zckv2" podUID="81a3bf03-822b-4b69-93a3-b420d8f58efd" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.505953 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.571446 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.641866 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.689825 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.929689 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"] Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.996096 4792 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.056411 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55c479c949-bxrgx"] Mar 01 09:14:04 crc kubenswrapper[4792]: W0301 09:14:04.060022 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod059440a1_ff60_496f_bdbc_8218b5ceb3f7.slice/crio-93588b0b710160330739dce49ef130b3bc798f987a69e63656832318c0b660e9 WatchSource:0}: Error finding container 93588b0b710160330739dce49ef130b3bc798f987a69e63656832318c0b660e9: Status 404 returned error can't find the container with id 93588b0b710160330739dce49ef130b3bc798f987a69e63656832318c0b660e9 Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.166522 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.233731 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" event={"ID":"f9ec3613-4fe1-4e71-8991-f5be9a94579e","Type":"ContainerStarted","Data":"1e70c60c4e7c815e1ff19c61b0b52d3e0a670b9a447599618d07c39ce492f957"} Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.234862 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" event={"ID":"059440a1-ff60-496f-bdbc-8218b5ceb3f7","Type":"ContainerStarted","Data":"93588b0b710160330739dce49ef130b3bc798f987a69e63656832318c0b660e9"} Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.235024 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.235677 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.425088 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.540885 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.608320 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.627568 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539274-zckv2"] Mar 01 09:14:04 crc kubenswrapper[4792]: W0301 09:14:04.634084 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81a3bf03_822b_4b69_93a3_b420d8f58efd.slice/crio-a3c24e76c9cd27814b70d42d5440e38471f26a7d1011faa03af0f134d3a951a7 WatchSource:0}: Error finding container a3c24e76c9cd27814b70d42d5440e38471f26a7d1011faa03af0f134d3a951a7: Status 404 returned error can't find the container with id a3c24e76c9cd27814b70d42d5440e38471f26a7d1011faa03af0f134d3a951a7 Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.640443 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.685695 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.756062 4792 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.837962 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.860196 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.003642 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.059994 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.083048 4792 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.099535 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.172165 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.240576 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" event={"ID":"f9ec3613-4fe1-4e71-8991-f5be9a94579e","Type":"ContainerStarted","Data":"b92fa0430770ca4ba11287fde661a77eb495b1dfe8b0c1b92d46f3d044ca5494"} Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.241669 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:14:05 
crc kubenswrapper[4792]: I0301 09:14:05.243276 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539274-zckv2" event={"ID":"81a3bf03-822b-4b69-93a3-b420d8f58efd","Type":"ContainerStarted","Data":"a3c24e76c9cd27814b70d42d5440e38471f26a7d1011faa03af0f134d3a951a7"} Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.245688 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" event={"ID":"059440a1-ff60-496f-bdbc-8218b5ceb3f7","Type":"ContainerStarted","Data":"f0bcfc05997e9dda23611e6b0d7d0d240d9f4f4c0b360a81f6b2c1cb880b601d"} Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.246242 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.251336 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.253395 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.262791 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" podStartSLOduration=48.262777829 podStartE2EDuration="48.262777829s" podCreationTimestamp="2026-03-01 09:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:14:05.261674422 +0000 UTC m=+374.503553619" watchObservedRunningTime="2026-03-01 09:14:05.262777829 +0000 UTC m=+374.504657026" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.330965 4792 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.380150 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 01 09:14:06 crc kubenswrapper[4792]: I0301 09:14:06.251387 4792 generic.go:334] "Generic (PLEG): container finished" podID="81a3bf03-822b-4b69-93a3-b420d8f58efd" containerID="8ef57da9b21fb114ba0f54a09e6174de667175684b505b54b0c846389388b402" exitCode=0 Mar 01 09:14:06 crc kubenswrapper[4792]: I0301 09:14:06.251587 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539274-zckv2" event={"ID":"81a3bf03-822b-4b69-93a3-b420d8f58efd","Type":"ContainerDied","Data":"8ef57da9b21fb114ba0f54a09e6174de667175684b505b54b0c846389388b402"} Mar 01 09:14:06 crc kubenswrapper[4792]: I0301 09:14:06.262872 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" podStartSLOduration=49.262854359 podStartE2EDuration="49.262854359s" podCreationTimestamp="2026-03-01 09:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:14:05.354100652 +0000 UTC m=+374.595979849" watchObservedRunningTime="2026-03-01 09:14:06.262854359 +0000 UTC m=+375.504733556" Mar 01 09:14:06 crc kubenswrapper[4792]: I0301 09:14:06.992570 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 01 09:14:06 crc kubenswrapper[4792]: I0301 09:14:06.992640 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096634 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096702 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096741 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096769 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096845 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096838 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: 
"resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096863 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096858 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096884 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.097055 4792 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.097065 4792 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.097073 4792 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.097081 4792 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.103130 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.198014 4792 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.260699 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.260752 4792 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae" exitCode=137 Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.260838 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.260949 4792 scope.go:117] "RemoveContainer" containerID="0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.279152 4792 scope.go:117] "RemoveContainer" containerID="0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae" Mar 01 09:14:07 crc kubenswrapper[4792]: E0301 09:14:07.279575 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae\": container with ID starting with 0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae not found: ID does not exist" containerID="0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.279638 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae"} err="failed to get container status \"0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae\": rpc error: code = NotFound desc = could not find container \"0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae\": container with ID starting with 0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae not found: ID does not exist" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.415658 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.416242 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.429067 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.429110 4792 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="ee49ed71-d637-4156-a077-becc9128fadb" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.433285 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.433334 4792 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="ee49ed71-d637-4156-a077-becc9128fadb" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.488108 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.606483 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnb28\" (UniqueName: \"kubernetes.io/projected/81a3bf03-822b-4b69-93a3-b420d8f58efd-kube-api-access-gnb28\") pod \"81a3bf03-822b-4b69-93a3-b420d8f58efd\" (UID: \"81a3bf03-822b-4b69-93a3-b420d8f58efd\") " Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.610975 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81a3bf03-822b-4b69-93a3-b420d8f58efd-kube-api-access-gnb28" (OuterVolumeSpecName: "kube-api-access-gnb28") pod "81a3bf03-822b-4b69-93a3-b420d8f58efd" (UID: "81a3bf03-822b-4b69-93a3-b420d8f58efd"). InnerVolumeSpecName "kube-api-access-gnb28". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.707788 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnb28\" (UniqueName: \"kubernetes.io/projected/81a3bf03-822b-4b69-93a3-b420d8f58efd-kube-api-access-gnb28\") on node \"crc\" DevicePath \"\"" Mar 01 09:14:08 crc kubenswrapper[4792]: I0301 09:14:08.270721 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539274-zckv2" event={"ID":"81a3bf03-822b-4b69-93a3-b420d8f58efd","Type":"ContainerDied","Data":"a3c24e76c9cd27814b70d42d5440e38471f26a7d1011faa03af0f134d3a951a7"} Mar 01 09:14:08 crc kubenswrapper[4792]: I0301 09:14:08.270762 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3c24e76c9cd27814b70d42d5440e38471f26a7d1011faa03af0f134d3a951a7" Mar 01 09:14:08 crc kubenswrapper[4792]: I0301 09:14:08.270779 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:08 crc kubenswrapper[4792]: I0301 09:14:08.594026 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.137764 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4"] Mar 01 09:15:00 crc kubenswrapper[4792]: E0301 09:15:00.139733 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.139820 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 01 09:15:00 crc kubenswrapper[4792]: E0301 09:15:00.139929 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81a3bf03-822b-4b69-93a3-b420d8f58efd" containerName="oc" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.140019 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a3bf03-822b-4b69-93a3-b420d8f58efd" containerName="oc" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.140202 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="81a3bf03-822b-4b69-93a3-b420d8f58efd" containerName="oc" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.140282 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.140697 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.147080 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4"] Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.147647 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.148234 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.156092 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s26sl\" (UniqueName: \"kubernetes.io/projected/8558286c-6cb2-4061-bb84-07803d33b576-kube-api-access-s26sl\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.156192 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8558286c-6cb2-4061-bb84-07803d33b576-config-volume\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.156215 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8558286c-6cb2-4061-bb84-07803d33b576-secret-volume\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.257865 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8558286c-6cb2-4061-bb84-07803d33b576-config-volume\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.257938 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8558286c-6cb2-4061-bb84-07803d33b576-secret-volume\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.258003 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s26sl\" (UniqueName: \"kubernetes.io/projected/8558286c-6cb2-4061-bb84-07803d33b576-kube-api-access-s26sl\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.259012 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8558286c-6cb2-4061-bb84-07803d33b576-config-volume\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.264093 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/8558286c-6cb2-4061-bb84-07803d33b576-secret-volume\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.274335 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s26sl\" (UniqueName: \"kubernetes.io/projected/8558286c-6cb2-4061-bb84-07803d33b576-kube-api-access-s26sl\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.456066 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.643739 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4"] Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.675281 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" event={"ID":"8558286c-6cb2-4061-bb84-07803d33b576","Type":"ContainerStarted","Data":"fc342858aad6d7fe721366ec61b4a6ce763019dbdc83ff19b9b3acaafecb55fe"} Mar 01 09:15:01 crc kubenswrapper[4792]: I0301 09:15:01.682268 4792 generic.go:334] "Generic (PLEG): container finished" podID="8558286c-6cb2-4061-bb84-07803d33b576" containerID="25d35f30a0bda8efbd3c0227d7f47d3a25118a249c78dad70cafb27c52068acf" exitCode=0 Mar 01 09:15:01 crc kubenswrapper[4792]: I0301 09:15:01.682429 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" 
event={"ID":"8558286c-6cb2-4061-bb84-07803d33b576","Type":"ContainerDied","Data":"25d35f30a0bda8efbd3c0227d7f47d3a25118a249c78dad70cafb27c52068acf"} Mar 01 09:15:02 crc kubenswrapper[4792]: I0301 09:15:02.962261 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.094370 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s26sl\" (UniqueName: \"kubernetes.io/projected/8558286c-6cb2-4061-bb84-07803d33b576-kube-api-access-s26sl\") pod \"8558286c-6cb2-4061-bb84-07803d33b576\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.094443 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8558286c-6cb2-4061-bb84-07803d33b576-secret-volume\") pod \"8558286c-6cb2-4061-bb84-07803d33b576\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.094519 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8558286c-6cb2-4061-bb84-07803d33b576-config-volume\") pod \"8558286c-6cb2-4061-bb84-07803d33b576\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.095579 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8558286c-6cb2-4061-bb84-07803d33b576-config-volume" (OuterVolumeSpecName: "config-volume") pod "8558286c-6cb2-4061-bb84-07803d33b576" (UID: "8558286c-6cb2-4061-bb84-07803d33b576"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.099891 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8558286c-6cb2-4061-bb84-07803d33b576-kube-api-access-s26sl" (OuterVolumeSpecName: "kube-api-access-s26sl") pod "8558286c-6cb2-4061-bb84-07803d33b576" (UID: "8558286c-6cb2-4061-bb84-07803d33b576"). InnerVolumeSpecName "kube-api-access-s26sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.109046 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8558286c-6cb2-4061-bb84-07803d33b576-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8558286c-6cb2-4061-bb84-07803d33b576" (UID: "8558286c-6cb2-4061-bb84-07803d33b576"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.196253 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s26sl\" (UniqueName: \"kubernetes.io/projected/8558286c-6cb2-4061-bb84-07803d33b576-kube-api-access-s26sl\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.196290 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8558286c-6cb2-4061-bb84-07803d33b576-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.196300 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8558286c-6cb2-4061-bb84-07803d33b576-config-volume\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.692665 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" 
event={"ID":"8558286c-6cb2-4061-bb84-07803d33b576","Type":"ContainerDied","Data":"fc342858aad6d7fe721366ec61b4a6ce763019dbdc83ff19b9b3acaafecb55fe"} Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.692705 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc342858aad6d7fe721366ec61b4a6ce763019dbdc83ff19b9b3acaafecb55fe" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.692736 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:10 crc kubenswrapper[4792]: I0301 09:15:10.482041 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p6b9w"] Mar 01 09:15:10 crc kubenswrapper[4792]: I0301 09:15:10.482808 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p6b9w" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="registry-server" containerID="cri-o://9b1ebd40c76d6a05f41f90e7e6ff9f66cc3e0ca75b2cbd8232430f1380f165ea" gracePeriod=2 Mar 01 09:15:10 crc kubenswrapper[4792]: I0301 09:15:10.735142 4792 generic.go:334] "Generic (PLEG): container finished" podID="6fd91972-6bfc-4041-abc2-8f4298584603" containerID="9b1ebd40c76d6a05f41f90e7e6ff9f66cc3e0ca75b2cbd8232430f1380f165ea" exitCode=0 Mar 01 09:15:10 crc kubenswrapper[4792]: I0301 09:15:10.735182 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6b9w" event={"ID":"6fd91972-6bfc-4041-abc2-8f4298584603","Type":"ContainerDied","Data":"9b1ebd40c76d6a05f41f90e7e6ff9f66cc3e0ca75b2cbd8232430f1380f165ea"} Mar 01 09:15:10 crc kubenswrapper[4792]: I0301 09:15:10.918198 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.033519 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-catalog-content\") pod \"6fd91972-6bfc-4041-abc2-8f4298584603\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.033558 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-utilities\") pod \"6fd91972-6bfc-4041-abc2-8f4298584603\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.033619 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whj6h\" (UniqueName: \"kubernetes.io/projected/6fd91972-6bfc-4041-abc2-8f4298584603-kube-api-access-whj6h\") pod \"6fd91972-6bfc-4041-abc2-8f4298584603\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.034352 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-utilities" (OuterVolumeSpecName: "utilities") pod "6fd91972-6bfc-4041-abc2-8f4298584603" (UID: "6fd91972-6bfc-4041-abc2-8f4298584603"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.047115 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd91972-6bfc-4041-abc2-8f4298584603-kube-api-access-whj6h" (OuterVolumeSpecName: "kube-api-access-whj6h") pod "6fd91972-6bfc-4041-abc2-8f4298584603" (UID: "6fd91972-6bfc-4041-abc2-8f4298584603"). InnerVolumeSpecName "kube-api-access-whj6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.084735 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fd91972-6bfc-4041-abc2-8f4298584603" (UID: "6fd91972-6bfc-4041-abc2-8f4298584603"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.134730 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whj6h\" (UniqueName: \"kubernetes.io/projected/6fd91972-6bfc-4041-abc2-8f4298584603-kube-api-access-whj6h\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.134764 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.134774 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.744941 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6b9w" event={"ID":"6fd91972-6bfc-4041-abc2-8f4298584603","Type":"ContainerDied","Data":"477ff8111ce50f202ac20e7801f2ac3bd82c5ed546b9e6aeee69367ec0d09908"} Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.744998 4792 scope.go:117] "RemoveContainer" containerID="9b1ebd40c76d6a05f41f90e7e6ff9f66cc3e0ca75b2cbd8232430f1380f165ea" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.745216 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.761729 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p6b9w"] Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.768810 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p6b9w"] Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.769333 4792 scope.go:117] "RemoveContainer" containerID="303602554c4af893edadfe462be0a7b315bc51a29028b034dfe41a16ef93ff3e" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.787416 4792 scope.go:117] "RemoveContainer" containerID="aa50e0a49246a84ff0433571a1c37df20be8eda2033b2980a1562df09196f210" Mar 01 09:15:13 crc kubenswrapper[4792]: I0301 09:15:13.418755 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" path="/var/lib/kubelet/pods/6fd91972-6bfc-4041-abc2-8f4298584603/volumes" Mar 01 09:15:34 crc kubenswrapper[4792]: I0301 09:15:34.943325 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:15:34 crc kubenswrapper[4792]: I0301 09:15:34.944075 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.344661 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5rg6z"] Mar 01 09:15:36 crc kubenswrapper[4792]: 
E0301 09:15:36.345133 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="extract-content" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.345145 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="extract-content" Mar 01 09:15:36 crc kubenswrapper[4792]: E0301 09:15:36.345159 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8558286c-6cb2-4061-bb84-07803d33b576" containerName="collect-profiles" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.345166 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8558286c-6cb2-4061-bb84-07803d33b576" containerName="collect-profiles" Mar 01 09:15:36 crc kubenswrapper[4792]: E0301 09:15:36.345178 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="extract-utilities" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.345185 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="extract-utilities" Mar 01 09:15:36 crc kubenswrapper[4792]: E0301 09:15:36.345192 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="registry-server" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.345198 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="registry-server" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.345282 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8558286c-6cb2-4061-bb84-07803d33b576" containerName="collect-profiles" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.345297 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="registry-server" Mar 01 09:15:36 crc 
kubenswrapper[4792]: I0301 09:15:36.345644 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.364208 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5rg6z"] Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449153 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-registry-tls\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449202 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttznf\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-kube-api-access-ttznf\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449243 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449282 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5345b06a-b353-4d3b-aee6-3e69c35a6325-ca-trust-extracted\") pod 
\"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449313 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-bound-sa-token\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449371 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5345b06a-b353-4d3b-aee6-3e69c35a6325-registry-certificates\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449403 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5345b06a-b353-4d3b-aee6-3e69c35a6325-trusted-ca\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449455 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5345b06a-b353-4d3b-aee6-3e69c35a6325-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.474688 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.550634 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5345b06a-b353-4d3b-aee6-3e69c35a6325-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.550771 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-bound-sa-token\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.551516 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5345b06a-b353-4d3b-aee6-3e69c35a6325-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.551572 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5345b06a-b353-4d3b-aee6-3e69c35a6325-registry-certificates\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.551599 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5345b06a-b353-4d3b-aee6-3e69c35a6325-trusted-ca\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.551636 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5345b06a-b353-4d3b-aee6-3e69c35a6325-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.551677 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-registry-tls\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.551726 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttznf\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-kube-api-access-ttznf\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.553500 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5345b06a-b353-4d3b-aee6-3e69c35a6325-trusted-ca\") pod 
\"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.555319 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5345b06a-b353-4d3b-aee6-3e69c35a6325-registry-certificates\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.558277 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-registry-tls\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.563630 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5345b06a-b353-4d3b-aee6-3e69c35a6325-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.565733 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-bound-sa-token\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.572836 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttznf\" (UniqueName: 
\"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-kube-api-access-ttznf\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.663008 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:37 crc kubenswrapper[4792]: I0301 09:15:37.097094 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5rg6z"] Mar 01 09:15:37 crc kubenswrapper[4792]: I0301 09:15:37.881252 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" event={"ID":"5345b06a-b353-4d3b-aee6-3e69c35a6325","Type":"ContainerStarted","Data":"b943412d0fc1e2de925dfa8818689ec70ad2378afa3ec77e5b5c25996cd150c9"} Mar 01 09:15:37 crc kubenswrapper[4792]: I0301 09:15:37.881754 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:37 crc kubenswrapper[4792]: I0301 09:15:37.881786 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" event={"ID":"5345b06a-b353-4d3b-aee6-3e69c35a6325","Type":"ContainerStarted","Data":"796761eff49436efc7fb008294beb4a0f81a2f40a24492a018407426479ccf5b"} Mar 01 09:15:37 crc kubenswrapper[4792]: I0301 09:15:37.903872 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" podStartSLOduration=1.903842274 podStartE2EDuration="1.903842274s" podCreationTimestamp="2026-03-01 09:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:15:37.89813619 +0000 UTC m=+467.140015387" 
watchObservedRunningTime="2026-03-01 09:15:37.903842274 +0000 UTC m=+467.145721491" Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.738385 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cw675"] Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.739001 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cw675" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="registry-server" containerID="cri-o://5bb5a5f0949169743626acf007f4aa939920851854703451752f35c1714d5b63" gracePeriod=30 Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.745176 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wxb87"] Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.745445 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wxb87" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="registry-server" containerID="cri-o://9dea4572b2cae8fb3cd81b4d5fe3e648bf003be8e11b2fe3e243bf7601f66f32" gracePeriod=30 Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.754319 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gk6c6"] Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.754653 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" podUID="e37e6dcb-be13-4787-8555-3ba1050f7b77" containerName="marketplace-operator" containerID="cri-o://8c78eddf897c2741ce992ebcf0cd416c60d059664ef1623df2239b0550809bd0" gracePeriod=30 Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.774859 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n28r8"] Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.775102 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n28r8" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="registry-server" containerID="cri-o://1897e3e89e2859f0dcfe575b896a35cdde57822147e478674713823e7d25153f" gracePeriod=30 Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.786297 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gfkbs"] Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.787044 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.788522 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7nwc2"] Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.788769 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7nwc2" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="registry-server" containerID="cri-o://6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec" gracePeriod=30 Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.816631 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gfkbs"] Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.931591 4792 generic.go:334] "Generic (PLEG): container finished" podID="e37e6dcb-be13-4787-8555-3ba1050f7b77" containerID="8c78eddf897c2741ce992ebcf0cd416c60d059664ef1623df2239b0550809bd0" exitCode=0 Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.931931 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" event={"ID":"e37e6dcb-be13-4787-8555-3ba1050f7b77","Type":"ContainerDied","Data":"8c78eddf897c2741ce992ebcf0cd416c60d059664ef1623df2239b0550809bd0"} 
Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.956794 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46fe59e7-8122-4621-ae8d-237a91daee5e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.956847 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46fe59e7-8122-4621-ae8d-237a91daee5e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.956949 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6stz4\" (UniqueName: \"kubernetes.io/projected/46fe59e7-8122-4621-ae8d-237a91daee5e-kube-api-access-6stz4\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.959000 4792 generic.go:334] "Generic (PLEG): container finished" podID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerID="9dea4572b2cae8fb3cd81b4d5fe3e648bf003be8e11b2fe3e243bf7601f66f32" exitCode=0 Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.959090 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxb87" event={"ID":"9073e3da-2d6f-48a3-907a-e347f28559ae","Type":"ContainerDied","Data":"9dea4572b2cae8fb3cd81b4d5fe3e648bf003be8e11b2fe3e243bf7601f66f32"} Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 
09:15:43.969676 4792 generic.go:334] "Generic (PLEG): container finished" podID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerID="5bb5a5f0949169743626acf007f4aa939920851854703451752f35c1714d5b63" exitCode=0 Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.969732 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cw675" event={"ID":"dff0d675-52dd-4cac-a7be-8750333c28e3","Type":"ContainerDied","Data":"5bb5a5f0949169743626acf007f4aa939920851854703451752f35c1714d5b63"} Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.975041 4792 generic.go:334] "Generic (PLEG): container finished" podID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerID="1897e3e89e2859f0dcfe575b896a35cdde57822147e478674713823e7d25153f" exitCode=0 Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.975085 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n28r8" event={"ID":"e03dee1d-d7ca-422c-8af6-faa0c1af3863","Type":"ContainerDied","Data":"1897e3e89e2859f0dcfe575b896a35cdde57822147e478674713823e7d25153f"} Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.071528 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6stz4\" (UniqueName: \"kubernetes.io/projected/46fe59e7-8122-4621-ae8d-237a91daee5e-kube-api-access-6stz4\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.071585 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46fe59e7-8122-4621-ae8d-237a91daee5e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" Mar 01 09:15:44 crc 
kubenswrapper[4792]: I0301 09:15:44.071606 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46fe59e7-8122-4621-ae8d-237a91daee5e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.074675 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46fe59e7-8122-4621-ae8d-237a91daee5e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.090875 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46fe59e7-8122-4621-ae8d-237a91daee5e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.097393 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6stz4\" (UniqueName: \"kubernetes.io/projected/46fe59e7-8122-4621-ae8d-237a91daee5e-kube-api-access-6stz4\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.113178 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.217125 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.298563 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.305186 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.307012 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.374389 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thnsb\" (UniqueName: \"kubernetes.io/projected/dff0d675-52dd-4cac-a7be-8750333c28e3-kube-api-access-thnsb\") pod \"dff0d675-52dd-4cac-a7be-8750333c28e3\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.374434 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-catalog-content\") pod \"dff0d675-52dd-4cac-a7be-8750333c28e3\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.374529 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-utilities\") pod \"dff0d675-52dd-4cac-a7be-8750333c28e3\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " Mar 01 09:15:44 crc 
kubenswrapper[4792]: I0301 09:15:44.375417 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-utilities" (OuterVolumeSpecName: "utilities") pod "dff0d675-52dd-4cac-a7be-8750333c28e3" (UID: "dff0d675-52dd-4cac-a7be-8750333c28e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.379356 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff0d675-52dd-4cac-a7be-8750333c28e3-kube-api-access-thnsb" (OuterVolumeSpecName: "kube-api-access-thnsb") pod "dff0d675-52dd-4cac-a7be-8750333c28e3" (UID: "dff0d675-52dd-4cac-a7be-8750333c28e3"). InnerVolumeSpecName "kube-api-access-thnsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.379566 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.448877 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dff0d675-52dd-4cac-a7be-8750333c28e3" (UID: "dff0d675-52dd-4cac-a7be-8750333c28e3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475461 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc28k\" (UniqueName: \"kubernetes.io/projected/9073e3da-2d6f-48a3-907a-e347f28559ae-kube-api-access-nc28k\") pod \"9073e3da-2d6f-48a3-907a-e347f28559ae\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475506 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-catalog-content\") pod \"22c7c368-3523-4224-aebd-59b29640bed0\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475527 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-trusted-ca\") pod \"e37e6dcb-be13-4787-8555-3ba1050f7b77\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475547 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-utilities\") pod \"9073e3da-2d6f-48a3-907a-e347f28559ae\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475579 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-utilities\") pod \"22c7c368-3523-4224-aebd-59b29640bed0\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475615 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-kc5gl\" (UniqueName: \"kubernetes.io/projected/e03dee1d-d7ca-422c-8af6-faa0c1af3863-kube-api-access-kc5gl\") pod \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475634 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-catalog-content\") pod \"9073e3da-2d6f-48a3-907a-e347f28559ae\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475661 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-catalog-content\") pod \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475685 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-operator-metrics\") pod \"e37e6dcb-be13-4787-8555-3ba1050f7b77\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475712 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-utilities\") pod \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475737 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djdkr\" (UniqueName: \"kubernetes.io/projected/22c7c368-3523-4224-aebd-59b29640bed0-kube-api-access-djdkr\") pod \"22c7c368-3523-4224-aebd-59b29640bed0\" (UID: 
\"22c7c368-3523-4224-aebd-59b29640bed0\") " Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475771 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7pkt\" (UniqueName: \"kubernetes.io/projected/e37e6dcb-be13-4787-8555-3ba1050f7b77-kube-api-access-r7pkt\") pod \"e37e6dcb-be13-4787-8555-3ba1050f7b77\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475963 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475976 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thnsb\" (UniqueName: \"kubernetes.io/projected/dff0d675-52dd-4cac-a7be-8750333c28e3-kube-api-access-thnsb\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475988 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.477690 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-utilities" (OuterVolumeSpecName: "utilities") pod "22c7c368-3523-4224-aebd-59b29640bed0" (UID: "22c7c368-3523-4224-aebd-59b29640bed0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.478944 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9073e3da-2d6f-48a3-907a-e347f28559ae-kube-api-access-nc28k" (OuterVolumeSpecName: "kube-api-access-nc28k") pod "9073e3da-2d6f-48a3-907a-e347f28559ae" (UID: "9073e3da-2d6f-48a3-907a-e347f28559ae"). InnerVolumeSpecName "kube-api-access-nc28k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.478976 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-utilities" (OuterVolumeSpecName: "utilities") pod "e03dee1d-d7ca-422c-8af6-faa0c1af3863" (UID: "e03dee1d-d7ca-422c-8af6-faa0c1af3863"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.480412 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e03dee1d-d7ca-422c-8af6-faa0c1af3863-kube-api-access-kc5gl" (OuterVolumeSpecName: "kube-api-access-kc5gl") pod "e03dee1d-d7ca-422c-8af6-faa0c1af3863" (UID: "e03dee1d-d7ca-422c-8af6-faa0c1af3863"). InnerVolumeSpecName "kube-api-access-kc5gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.482296 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c7c368-3523-4224-aebd-59b29640bed0-kube-api-access-djdkr" (OuterVolumeSpecName: "kube-api-access-djdkr") pod "22c7c368-3523-4224-aebd-59b29640bed0" (UID: "22c7c368-3523-4224-aebd-59b29640bed0"). InnerVolumeSpecName "kube-api-access-djdkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.482932 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-utilities" (OuterVolumeSpecName: "utilities") pod "9073e3da-2d6f-48a3-907a-e347f28559ae" (UID: "9073e3da-2d6f-48a3-907a-e347f28559ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.483356 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "e37e6dcb-be13-4787-8555-3ba1050f7b77" (UID: "e37e6dcb-be13-4787-8555-3ba1050f7b77"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.476995 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "e37e6dcb-be13-4787-8555-3ba1050f7b77" (UID: "e37e6dcb-be13-4787-8555-3ba1050f7b77"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.490750 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37e6dcb-be13-4787-8555-3ba1050f7b77-kube-api-access-r7pkt" (OuterVolumeSpecName: "kube-api-access-r7pkt") pod "e37e6dcb-be13-4787-8555-3ba1050f7b77" (UID: "e37e6dcb-be13-4787-8555-3ba1050f7b77"). InnerVolumeSpecName "kube-api-access-r7pkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.505602 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e03dee1d-d7ca-422c-8af6-faa0c1af3863" (UID: "e03dee1d-d7ca-422c-8af6-faa0c1af3863"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.533606 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9073e3da-2d6f-48a3-907a-e347f28559ae" (UID: "9073e3da-2d6f-48a3-907a-e347f28559ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580557 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7pkt\" (UniqueName: \"kubernetes.io/projected/e37e6dcb-be13-4787-8555-3ba1050f7b77-kube-api-access-r7pkt\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580583 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc28k\" (UniqueName: \"kubernetes.io/projected/9073e3da-2d6f-48a3-907a-e347f28559ae-kube-api-access-nc28k\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580595 4792 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580605 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580614 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580624 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc5gl\" (UniqueName: \"kubernetes.io/projected/e03dee1d-d7ca-422c-8af6-faa0c1af3863-kube-api-access-kc5gl\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580632 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580640 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580649 4792 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580657 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580666 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djdkr\" (UniqueName: \"kubernetes.io/projected/22c7c368-3523-4224-aebd-59b29640bed0-kube-api-access-djdkr\") on node 
\"crc\" DevicePath \"\"" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.603736 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22c7c368-3523-4224-aebd-59b29640bed0" (UID: "22c7c368-3523-4224-aebd-59b29640bed0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.682069 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.702592 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gfkbs"] Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.990283 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" event={"ID":"e37e6dcb-be13-4787-8555-3ba1050f7b77","Type":"ContainerDied","Data":"5fe7291196c34fc28ce9dd0b3ec6175bb85545475a2aafe649770e45dc76a617"} Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.990341 4792 scope.go:117] "RemoveContainer" containerID="8c78eddf897c2741ce992ebcf0cd416c60d059664ef1623df2239b0550809bd0" Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.990428 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.000539 4792 generic.go:334] "Generic (PLEG): container finished" podID="22c7c368-3523-4224-aebd-59b29640bed0" containerID="6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec" exitCode=0 Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.000580 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7nwc2" event={"ID":"22c7c368-3523-4224-aebd-59b29640bed0","Type":"ContainerDied","Data":"6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec"} Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.000612 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.000636 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7nwc2" event={"ID":"22c7c368-3523-4224-aebd-59b29640bed0","Type":"ContainerDied","Data":"52cac0b41f9f5231fc4b99f244ac1d2bd2e66edf0a6640e4bac8c95c340ce560"} Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.003491 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxb87" event={"ID":"9073e3da-2d6f-48a3-907a-e347f28559ae","Type":"ContainerDied","Data":"f1bc5d15012443ccc9990bce32da439c2d26cd8bac7e62fa7b81593bfe925710"} Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.003578 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.018199 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.018506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cw675" event={"ID":"dff0d675-52dd-4cac-a7be-8750333c28e3","Type":"ContainerDied","Data":"339bd57c1cc13f4119c941d59da1a1ea961d6edda5147620d400c9a04f2e8ceb"} Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.021225 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.021236 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n28r8" event={"ID":"e03dee1d-d7ca-422c-8af6-faa0c1af3863","Type":"ContainerDied","Data":"385e5f3c0c4fe34a9ca54a291cae71ccbb846b542d87341115b79874eadd771a"} Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.021306 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gk6c6"] Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.025055 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" event={"ID":"46fe59e7-8122-4621-ae8d-237a91daee5e","Type":"ContainerStarted","Data":"42cd5bea5c56421991c632f84b06664b84686faff6b0f1d3b86dd94bdc098c36"} Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.025088 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" event={"ID":"46fe59e7-8122-4621-ae8d-237a91daee5e","Type":"ContainerStarted","Data":"1236fb768a5238a78e1ed5ceed2499ea772b0349c833723d91b36ba9abaee434"} Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.027365 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" Mar 01 09:15:45 crc 
kubenswrapper[4792]: I0301 09:15:45.027435 4792 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gfkbs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body= Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.027459 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" podUID="46fe59e7-8122-4621-ae8d-237a91daee5e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.028113 4792 scope.go:117] "RemoveContainer" containerID="6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.038376 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gk6c6"] Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.050361 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" podStartSLOduration=2.050339321 podStartE2EDuration="2.050339321s" podCreationTimestamp="2026-03-01 09:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:15:45.045328674 +0000 UTC m=+474.287207871" watchObservedRunningTime="2026-03-01 09:15:45.050339321 +0000 UTC m=+474.292218518" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.064046 4792 scope.go:117] "RemoveContainer" containerID="225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.071245 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-7nwc2"] Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.076771 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7nwc2"] Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.087522 4792 scope.go:117] "RemoveContainer" containerID="821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.096639 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cw675"] Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.106743 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cw675"] Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.114683 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n28r8"] Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.114751 4792 scope.go:117] "RemoveContainer" containerID="6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.115255 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec\": container with ID starting with 6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec not found: ID does not exist" containerID="6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.115291 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec"} err="failed to get container status \"6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec\": rpc error: code = NotFound desc = could not find container 
\"6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec\": container with ID starting with 6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec not found: ID does not exist" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.115315 4792 scope.go:117] "RemoveContainer" containerID="225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.115556 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1\": container with ID starting with 225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1 not found: ID does not exist" containerID="225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.115581 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1"} err="failed to get container status \"225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1\": rpc error: code = NotFound desc = could not find container \"225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1\": container with ID starting with 225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1 not found: ID does not exist" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.115595 4792 scope.go:117] "RemoveContainer" containerID="821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.115761 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46\": container with ID starting with 821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46 not found: ID does not exist" 
containerID="821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.115780 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46"} err="failed to get container status \"821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46\": rpc error: code = NotFound desc = could not find container \"821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46\": container with ID starting with 821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46 not found: ID does not exist" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.115792 4792 scope.go:117] "RemoveContainer" containerID="9dea4572b2cae8fb3cd81b4d5fe3e648bf003be8e11b2fe3e243bf7601f66f32" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.116821 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n28r8"] Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.133208 4792 scope.go:117] "RemoveContainer" containerID="4b4748f2f641b6c36501b4a63c23878d8149ab901a48ba6ca75290682667f801" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.138961 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wxb87"] Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.142042 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wxb87"] Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.157820 4792 scope.go:117] "RemoveContainer" containerID="cabcf6681e53e945ec2b29b89c123577ba10471c42575321a6f1da2be37ffe2a" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.173379 4792 scope.go:117] "RemoveContainer" containerID="5bb5a5f0949169743626acf007f4aa939920851854703451752f35c1714d5b63" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.186840 4792 scope.go:117] 
"RemoveContainer" containerID="60c5c14884f306ec02b79c73b52f33a5df66be0f71db3ee1b928c0b932dd06d6" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.227021 4792 scope.go:117] "RemoveContainer" containerID="2793d9d53a76fa1bceb0317b96453e19780c382a91f9c7fd6f680978a1c2a121" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.241086 4792 scope.go:117] "RemoveContainer" containerID="1897e3e89e2859f0dcfe575b896a35cdde57822147e478674713823e7d25153f" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.254094 4792 scope.go:117] "RemoveContainer" containerID="d7a22eb25032f48508905d8110fc2779a9b8e1d8380aa44e93999853b14a1f56" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.268930 4792 scope.go:117] "RemoveContainer" containerID="0c84982bc61501954eca9c9293432f6ed8f745c671471318271733558785ba39" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.414741 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c7c368-3523-4224-aebd-59b29640bed0" path="/var/lib/kubelet/pods/22c7c368-3523-4224-aebd-59b29640bed0/volumes" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.415493 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" path="/var/lib/kubelet/pods/9073e3da-2d6f-48a3-907a-e347f28559ae/volumes" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.416055 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" path="/var/lib/kubelet/pods/dff0d675-52dd-4cac-a7be-8750333c28e3/volumes" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.417019 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" path="/var/lib/kubelet/pods/e03dee1d-d7ca-422c-8af6-faa0c1af3863/volumes" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.417594 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37e6dcb-be13-4787-8555-3ba1050f7b77" 
path="/var/lib/kubelet/pods/e37e6dcb-be13-4787-8555-3ba1050f7b77/volumes" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.952307 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zjvjw"] Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953605 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37e6dcb-be13-4787-8555-3ba1050f7b77" containerName="marketplace-operator" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953624 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37e6dcb-be13-4787-8555-3ba1050f7b77" containerName="marketplace-operator" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953636 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="extract-utilities" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953644 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="extract-utilities" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953660 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="extract-utilities" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953667 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="extract-utilities" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953677 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="extract-utilities" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953685 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="extract-utilities" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953694 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953701 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953716 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953724 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953733 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="extract-content" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953740 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="extract-content" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953752 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="extract-utilities" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953759 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="extract-utilities" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953768 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="extract-content" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953777 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="extract-content" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953789 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="extract-content" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953796 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="extract-content" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953809 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="extract-content" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953816 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="extract-content" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953824 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953832 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953842 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953849 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953963 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953980 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37e6dcb-be13-4787-8555-3ba1050f7b77" containerName="marketplace-operator" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953989 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.954000 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.954012 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.955740 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.956833 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjvjw"] Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.957723 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.038212 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.101830 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxjb6\" (UniqueName: \"kubernetes.io/projected/3003e690-c3dd-4236-a95c-a0fb6ccb438e-kube-api-access-hxjb6\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.101931 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3003e690-c3dd-4236-a95c-a0fb6ccb438e-utilities\") pod \"redhat-marketplace-zjvjw\" (UID: 
\"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.101958 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3003e690-c3dd-4236-a95c-a0fb6ccb438e-catalog-content\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.151945 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7fdqh"] Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.152878 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.154674 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.161419 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7fdqh"] Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.202967 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3003e690-c3dd-4236-a95c-a0fb6ccb438e-utilities\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.203014 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3003e690-c3dd-4236-a95c-a0fb6ccb438e-catalog-content\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " 
pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.203063 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxjb6\" (UniqueName: \"kubernetes.io/projected/3003e690-c3dd-4236-a95c-a0fb6ccb438e-kube-api-access-hxjb6\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.203509 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3003e690-c3dd-4236-a95c-a0fb6ccb438e-utilities\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.203739 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3003e690-c3dd-4236-a95c-a0fb6ccb438e-catalog-content\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.220647 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxjb6\" (UniqueName: \"kubernetes.io/projected/3003e690-c3dd-4236-a95c-a0fb6ccb438e-kube-api-access-hxjb6\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.282998 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.304369 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38bc0c09-286e-427a-95c2-8e2c9213b142-catalog-content\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.304620 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bwgl\" (UniqueName: \"kubernetes.io/projected/38bc0c09-286e-427a-95c2-8e2c9213b142-kube-api-access-4bwgl\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.304665 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38bc0c09-286e-427a-95c2-8e2c9213b142-utilities\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.406104 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bwgl\" (UniqueName: \"kubernetes.io/projected/38bc0c09-286e-427a-95c2-8e2c9213b142-kube-api-access-4bwgl\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.406148 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38bc0c09-286e-427a-95c2-8e2c9213b142-catalog-content\") pod 
\"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.406197 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38bc0c09-286e-427a-95c2-8e2c9213b142-utilities\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.406751 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38bc0c09-286e-427a-95c2-8e2c9213b142-utilities\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.406769 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38bc0c09-286e-427a-95c2-8e2c9213b142-catalog-content\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.433272 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bwgl\" (UniqueName: \"kubernetes.io/projected/38bc0c09-286e-427a-95c2-8e2c9213b142-kube-api-access-4bwgl\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.479427 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.649347 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7fdqh"] Mar 01 09:15:46 crc kubenswrapper[4792]: W0301 09:15:46.662631 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38bc0c09_286e_427a_95c2_8e2c9213b142.slice/crio-11328d282ef0220b6921dadaddb7e58797ad6ade046d4093abc16c9afd96b991 WatchSource:0}: Error finding container 11328d282ef0220b6921dadaddb7e58797ad6ade046d4093abc16c9afd96b991: Status 404 returned error can't find the container with id 11328d282ef0220b6921dadaddb7e58797ad6ade046d4093abc16c9afd96b991 Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.684675 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjvjw"] Mar 01 09:15:46 crc kubenswrapper[4792]: W0301 09:15:46.692593 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3003e690_c3dd_4236_a95c_a0fb6ccb438e.slice/crio-75f1c5432baa41cfe8d19814ad008dcc8898229ebacd240889398863afc86cc0 WatchSource:0}: Error finding container 75f1c5432baa41cfe8d19814ad008dcc8898229ebacd240889398863afc86cc0: Status 404 returned error can't find the container with id 75f1c5432baa41cfe8d19814ad008dcc8898229ebacd240889398863afc86cc0 Mar 01 09:15:47 crc kubenswrapper[4792]: I0301 09:15:47.041539 4792 generic.go:334] "Generic (PLEG): container finished" podID="38bc0c09-286e-427a-95c2-8e2c9213b142" containerID="c13d3b38df8892b424aaefbf200315183e532ad4e85474cdcebdfa2a35e8d0f9" exitCode=0 Mar 01 09:15:47 crc kubenswrapper[4792]: I0301 09:15:47.041924 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fdqh" 
event={"ID":"38bc0c09-286e-427a-95c2-8e2c9213b142","Type":"ContainerDied","Data":"c13d3b38df8892b424aaefbf200315183e532ad4e85474cdcebdfa2a35e8d0f9"} Mar 01 09:15:47 crc kubenswrapper[4792]: I0301 09:15:47.041955 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fdqh" event={"ID":"38bc0c09-286e-427a-95c2-8e2c9213b142","Type":"ContainerStarted","Data":"11328d282ef0220b6921dadaddb7e58797ad6ade046d4093abc16c9afd96b991"} Mar 01 09:15:47 crc kubenswrapper[4792]: I0301 09:15:47.052410 4792 generic.go:334] "Generic (PLEG): container finished" podID="3003e690-c3dd-4236-a95c-a0fb6ccb438e" containerID="c4c9b2c9b9a9f0f3a82ec42aed5a463fb7f5caafa0cf3d2f1487056d63745338" exitCode=0 Mar 01 09:15:47 crc kubenswrapper[4792]: I0301 09:15:47.053363 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjvjw" event={"ID":"3003e690-c3dd-4236-a95c-a0fb6ccb438e","Type":"ContainerDied","Data":"c4c9b2c9b9a9f0f3a82ec42aed5a463fb7f5caafa0cf3d2f1487056d63745338"} Mar 01 09:15:47 crc kubenswrapper[4792]: I0301 09:15:47.053416 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjvjw" event={"ID":"3003e690-c3dd-4236-a95c-a0fb6ccb438e","Type":"ContainerStarted","Data":"75f1c5432baa41cfe8d19814ad008dcc8898229ebacd240889398863afc86cc0"} Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.059270 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fdqh" event={"ID":"38bc0c09-286e-427a-95c2-8e2c9213b142","Type":"ContainerStarted","Data":"839dbd9c94d016359cf8c4184d8a4654289151b502be0d350e0f115d237e1290"} Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.064770 4792 generic.go:334] "Generic (PLEG): container finished" podID="3003e690-c3dd-4236-a95c-a0fb6ccb438e" containerID="63da93f364346bdc218e269d6eedf40c892f902c3beea9faf5fa777a6feba6b1" exitCode=0 Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 
09:15:48.064810 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjvjw" event={"ID":"3003e690-c3dd-4236-a95c-a0fb6ccb438e","Type":"ContainerDied","Data":"63da93f364346bdc218e269d6eedf40c892f902c3beea9faf5fa777a6feba6b1"} Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.355580 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sksw8"] Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.356482 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.360230 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.407566 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sksw8"] Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.530543 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c448d6-e926-4b07-8aec-8195d42d2e30-utilities\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.530639 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c448d6-e926-4b07-8aec-8195d42d2e30-catalog-content\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.530737 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2d6xs\" (UniqueName: \"kubernetes.io/projected/55c448d6-e926-4b07-8aec-8195d42d2e30-kube-api-access-2d6xs\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.558244 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-48zdf"] Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.559154 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.561154 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.564477 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-48zdf"] Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.632351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c448d6-e926-4b07-8aec-8195d42d2e30-utilities\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.632539 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c448d6-e926-4b07-8aec-8195d42d2e30-catalog-content\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.632659 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d6xs\" (UniqueName: 
\"kubernetes.io/projected/55c448d6-e926-4b07-8aec-8195d42d2e30-kube-api-access-2d6xs\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.633511 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c448d6-e926-4b07-8aec-8195d42d2e30-utilities\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.633610 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c448d6-e926-4b07-8aec-8195d42d2e30-catalog-content\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.651644 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d6xs\" (UniqueName: \"kubernetes.io/projected/55c448d6-e926-4b07-8aec-8195d42d2e30-kube-api-access-2d6xs\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.679965 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.733807 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d875f1af-e90b-4882-b472-f91651d468a6-catalog-content\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.733853 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz5td\" (UniqueName: \"kubernetes.io/projected/d875f1af-e90b-4882-b472-f91651d468a6-kube-api-access-hz5td\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.734089 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d875f1af-e90b-4882-b472-f91651d468a6-utilities\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.835546 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d875f1af-e90b-4882-b472-f91651d468a6-utilities\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.836534 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d875f1af-e90b-4882-b472-f91651d468a6-catalog-content\") pod 
\"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.836846 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz5td\" (UniqueName: \"kubernetes.io/projected/d875f1af-e90b-4882-b472-f91651d468a6-kube-api-access-hz5td\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.836462 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d875f1af-e90b-4882-b472-f91651d468a6-utilities\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.836819 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d875f1af-e90b-4882-b472-f91651d468a6-catalog-content\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.856289 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz5td\" (UniqueName: \"kubernetes.io/projected/d875f1af-e90b-4882-b472-f91651d468a6-kube-api-access-hz5td\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.891863 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sksw8"] Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.896051 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: W0301 09:15:48.897026 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55c448d6_e926_4b07_8aec_8195d42d2e30.slice/crio-7188e1badb0fe92bbaddbc026f4711272ea58838c819448b2a409d67c5885dbc WatchSource:0}: Error finding container 7188e1badb0fe92bbaddbc026f4711272ea58838c819448b2a409d67c5885dbc: Status 404 returned error can't find the container with id 7188e1badb0fe92bbaddbc026f4711272ea58838c819448b2a409d67c5885dbc Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.072436 4792 generic.go:334] "Generic (PLEG): container finished" podID="38bc0c09-286e-427a-95c2-8e2c9213b142" containerID="839dbd9c94d016359cf8c4184d8a4654289151b502be0d350e0f115d237e1290" exitCode=0 Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.072516 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fdqh" event={"ID":"38bc0c09-286e-427a-95c2-8e2c9213b142","Type":"ContainerDied","Data":"839dbd9c94d016359cf8c4184d8a4654289151b502be0d350e0f115d237e1290"} Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.078458 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjvjw" event={"ID":"3003e690-c3dd-4236-a95c-a0fb6ccb438e","Type":"ContainerStarted","Data":"5b9418a53311ff310523f43d73f4a3f2bea95acfbee1c9d2d5e16f15273d74b6"} Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.082493 4792 generic.go:334] "Generic (PLEG): container finished" podID="55c448d6-e926-4b07-8aec-8195d42d2e30" containerID="69f02268daf6c0c583cf7f502874715d5874c2b78b768e3cbfdb267828eb7cd1" exitCode=0 Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.082548 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sksw8" 
event={"ID":"55c448d6-e926-4b07-8aec-8195d42d2e30","Type":"ContainerDied","Data":"69f02268daf6c0c583cf7f502874715d5874c2b78b768e3cbfdb267828eb7cd1"} Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.082566 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sksw8" event={"ID":"55c448d6-e926-4b07-8aec-8195d42d2e30","Type":"ContainerStarted","Data":"7188e1badb0fe92bbaddbc026f4711272ea58838c819448b2a409d67c5885dbc"} Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.909869 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zjvjw" podStartSLOduration=3.474931759 podStartE2EDuration="4.909850191s" podCreationTimestamp="2026-03-01 09:15:45 +0000 UTC" firstStartedPulling="2026-03-01 09:15:47.054833933 +0000 UTC m=+476.296713130" lastFinishedPulling="2026-03-01 09:15:48.489752365 +0000 UTC m=+477.731631562" observedRunningTime="2026-03-01 09:15:49.137040021 +0000 UTC m=+478.378919218" watchObservedRunningTime="2026-03-01 09:15:49.909850191 +0000 UTC m=+479.151729388" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.912669 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-48zdf"] Mar 01 09:15:49 crc kubenswrapper[4792]: W0301 09:15:49.921075 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd875f1af_e90b_4882_b472_f91651d468a6.slice/crio-e28c7ca553ea613c89182637d5fa5350ca10a96c5efd6579efcdfacaa062ae3a WatchSource:0}: Error finding container e28c7ca553ea613c89182637d5fa5350ca10a96c5efd6579efcdfacaa062ae3a: Status 404 returned error can't find the container with id e28c7ca553ea613c89182637d5fa5350ca10a96c5efd6579efcdfacaa062ae3a Mar 01 09:15:50 crc kubenswrapper[4792]: I0301 09:15:50.090943 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48zdf" 
event={"ID":"d875f1af-e90b-4882-b472-f91651d468a6","Type":"ContainerDied","Data":"38e219276e0ef85d6a1ed88e5e195f9432165d376cd2b7270a896c136d1f9c41"} Mar 01 09:15:50 crc kubenswrapper[4792]: I0301 09:15:50.090889 4792 generic.go:334] "Generic (PLEG): container finished" podID="d875f1af-e90b-4882-b472-f91651d468a6" containerID="38e219276e0ef85d6a1ed88e5e195f9432165d376cd2b7270a896c136d1f9c41" exitCode=0 Mar 01 09:15:50 crc kubenswrapper[4792]: I0301 09:15:50.091099 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48zdf" event={"ID":"d875f1af-e90b-4882-b472-f91651d468a6","Type":"ContainerStarted","Data":"e28c7ca553ea613c89182637d5fa5350ca10a96c5efd6579efcdfacaa062ae3a"} Mar 01 09:15:50 crc kubenswrapper[4792]: I0301 09:15:50.094000 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fdqh" event={"ID":"38bc0c09-286e-427a-95c2-8e2c9213b142","Type":"ContainerStarted","Data":"c64c9d5273ae6c0bdebf9e4e070eb4a5f2be4f9396bee3796f8bb287aa3dfa8a"} Mar 01 09:15:50 crc kubenswrapper[4792]: I0301 09:15:50.097938 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sksw8" event={"ID":"55c448d6-e926-4b07-8aec-8195d42d2e30","Type":"ContainerStarted","Data":"d3850228aa71b2c0cb870102a30e8019bafab1c2433eb51d1d2f07b39afa4c78"} Mar 01 09:15:50 crc kubenswrapper[4792]: I0301 09:15:50.127718 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7fdqh" podStartSLOduration=1.698284781 podStartE2EDuration="4.127702859s" podCreationTimestamp="2026-03-01 09:15:46 +0000 UTC" firstStartedPulling="2026-03-01 09:15:47.043222669 +0000 UTC m=+476.285101866" lastFinishedPulling="2026-03-01 09:15:49.472640747 +0000 UTC m=+478.714519944" observedRunningTime="2026-03-01 09:15:50.126068198 +0000 UTC m=+479.367947395" watchObservedRunningTime="2026-03-01 09:15:50.127702859 +0000 UTC 
m=+479.369582056" Mar 01 09:15:51 crc kubenswrapper[4792]: I0301 09:15:51.103485 4792 generic.go:334] "Generic (PLEG): container finished" podID="55c448d6-e926-4b07-8aec-8195d42d2e30" containerID="d3850228aa71b2c0cb870102a30e8019bafab1c2433eb51d1d2f07b39afa4c78" exitCode=0 Mar 01 09:15:51 crc kubenswrapper[4792]: I0301 09:15:51.103553 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sksw8" event={"ID":"55c448d6-e926-4b07-8aec-8195d42d2e30","Type":"ContainerDied","Data":"d3850228aa71b2c0cb870102a30e8019bafab1c2433eb51d1d2f07b39afa4c78"} Mar 01 09:15:51 crc kubenswrapper[4792]: I0301 09:15:51.103582 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sksw8" event={"ID":"55c448d6-e926-4b07-8aec-8195d42d2e30","Type":"ContainerStarted","Data":"6ad93f8a6eda0fc3085969611c1dfac7079fe4c76d1610f5bd0732cb92d39fa4"} Mar 01 09:15:51 crc kubenswrapper[4792]: I0301 09:15:51.105487 4792 generic.go:334] "Generic (PLEG): container finished" podID="d875f1af-e90b-4882-b472-f91651d468a6" containerID="760877f9a7d61beb560ecc8cd400c7c3b45e280e1fd92ca600a9762634d49799" exitCode=0 Mar 01 09:15:51 crc kubenswrapper[4792]: I0301 09:15:51.105532 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48zdf" event={"ID":"d875f1af-e90b-4882-b472-f91651d468a6","Type":"ContainerDied","Data":"760877f9a7d61beb560ecc8cd400c7c3b45e280e1fd92ca600a9762634d49799"} Mar 01 09:15:51 crc kubenswrapper[4792]: I0301 09:15:51.128543 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sksw8" podStartSLOduration=1.497088545 podStartE2EDuration="3.128528165s" podCreationTimestamp="2026-03-01 09:15:48 +0000 UTC" firstStartedPulling="2026-03-01 09:15:49.083560269 +0000 UTC m=+478.325439466" lastFinishedPulling="2026-03-01 09:15:50.714999899 +0000 UTC m=+479.956879086" observedRunningTime="2026-03-01 
09:15:51.123117268 +0000 UTC m=+480.364996465" watchObservedRunningTime="2026-03-01 09:15:51.128528165 +0000 UTC m=+480.370407362" Mar 01 09:15:52 crc kubenswrapper[4792]: I0301 09:15:52.114318 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48zdf" event={"ID":"d875f1af-e90b-4882-b472-f91651d468a6","Type":"ContainerStarted","Data":"a01d67a8724088d0e436e85a9a9d642be57e702c9cd58c95fb5b239c0a0717e0"} Mar 01 09:15:52 crc kubenswrapper[4792]: I0301 09:15:52.136636 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-48zdf" podStartSLOduration=2.702169674 podStartE2EDuration="4.136603544s" podCreationTimestamp="2026-03-01 09:15:48 +0000 UTC" firstStartedPulling="2026-03-01 09:15:50.092385776 +0000 UTC m=+479.334264963" lastFinishedPulling="2026-03-01 09:15:51.526819646 +0000 UTC m=+480.768698833" observedRunningTime="2026-03-01 09:15:52.134241674 +0000 UTC m=+481.376120861" watchObservedRunningTime="2026-03-01 09:15:52.136603544 +0000 UTC m=+481.378482751" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.283498 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.284610 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.324830 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.480159 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.480208 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.515997 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.670008 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.725027 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4jwnr"] Mar 01 09:15:57 crc kubenswrapper[4792]: I0301 09:15:57.190022 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:57 crc kubenswrapper[4792]: I0301 09:15:57.205734 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:58 crc kubenswrapper[4792]: I0301 09:15:58.680532 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:58 crc kubenswrapper[4792]: I0301 09:15:58.681203 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:58 crc kubenswrapper[4792]: I0301 09:15:58.745538 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:58 crc kubenswrapper[4792]: I0301 09:15:58.896840 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:58 crc kubenswrapper[4792]: I0301 09:15:58.897123 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:58 crc kubenswrapper[4792]: I0301 
09:15:58.943522 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-48zdf"
Mar 01 09:15:59 crc kubenswrapper[4792]: I0301 09:15:59.180898 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sksw8"
Mar 01 09:15:59 crc kubenswrapper[4792]: I0301 09:15:59.191565 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-48zdf"
Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.140266 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539276-sbq86"]
Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.141397 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539276-sbq86"
Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.144819 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.145376 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.148327 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz"
Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.155977 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539276-sbq86"]
Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.274984 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjx6t\" (UniqueName: \"kubernetes.io/projected/0ffc24e3-2b40-4368-91a7-474239cc46fc-kube-api-access-zjx6t\") pod \"auto-csr-approver-29539276-sbq86\" (UID: \"0ffc24e3-2b40-4368-91a7-474239cc46fc\") " pod="openshift-infra/auto-csr-approver-29539276-sbq86"
Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.376768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjx6t\" (UniqueName: \"kubernetes.io/projected/0ffc24e3-2b40-4368-91a7-474239cc46fc-kube-api-access-zjx6t\") pod \"auto-csr-approver-29539276-sbq86\" (UID: \"0ffc24e3-2b40-4368-91a7-474239cc46fc\") " pod="openshift-infra/auto-csr-approver-29539276-sbq86"
Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.395400 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjx6t\" (UniqueName: \"kubernetes.io/projected/0ffc24e3-2b40-4368-91a7-474239cc46fc-kube-api-access-zjx6t\") pod \"auto-csr-approver-29539276-sbq86\" (UID: \"0ffc24e3-2b40-4368-91a7-474239cc46fc\") " pod="openshift-infra/auto-csr-approver-29539276-sbq86"
Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.467257 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539276-sbq86"
Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.664651 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539276-sbq86"]
Mar 01 09:16:00 crc kubenswrapper[4792]: W0301 09:16:00.670751 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ffc24e3_2b40_4368_91a7_474239cc46fc.slice/crio-10d4f99a6afeafcf8f1f54bf00feff9a933803d7a9ba797926dca7f642308a46 WatchSource:0}: Error finding container 10d4f99a6afeafcf8f1f54bf00feff9a933803d7a9ba797926dca7f642308a46: Status 404 returned error can't find the container with id 10d4f99a6afeafcf8f1f54bf00feff9a933803d7a9ba797926dca7f642308a46
Mar 01 09:16:01 crc kubenswrapper[4792]: I0301 09:16:01.160215 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539276-sbq86" event={"ID":"0ffc24e3-2b40-4368-91a7-474239cc46fc","Type":"ContainerStarted","Data":"10d4f99a6afeafcf8f1f54bf00feff9a933803d7a9ba797926dca7f642308a46"}
Mar 01 09:16:04 crc kubenswrapper[4792]: I0301 09:16:04.176647 4792 generic.go:334] "Generic (PLEG): container finished" podID="0ffc24e3-2b40-4368-91a7-474239cc46fc" containerID="7f82c367589f33dd358f1e6a48f6206b0470bf80f8be1de16c8420482e80dba1" exitCode=0
Mar 01 09:16:04 crc kubenswrapper[4792]: I0301 09:16:04.176723 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539276-sbq86" event={"ID":"0ffc24e3-2b40-4368-91a7-474239cc46fc","Type":"ContainerDied","Data":"7f82c367589f33dd358f1e6a48f6206b0470bf80f8be1de16c8420482e80dba1"}
Mar 01 09:16:04 crc kubenswrapper[4792]: I0301 09:16:04.943177 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:16:04 crc kubenswrapper[4792]: I0301 09:16:04.943227 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:16:05 crc kubenswrapper[4792]: I0301 09:16:05.441135 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539276-sbq86"
Mar 01 09:16:05 crc kubenswrapper[4792]: I0301 09:16:05.540466 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjx6t\" (UniqueName: \"kubernetes.io/projected/0ffc24e3-2b40-4368-91a7-474239cc46fc-kube-api-access-zjx6t\") pod \"0ffc24e3-2b40-4368-91a7-474239cc46fc\" (UID: \"0ffc24e3-2b40-4368-91a7-474239cc46fc\") "
Mar 01 09:16:05 crc kubenswrapper[4792]: I0301 09:16:05.552157 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ffc24e3-2b40-4368-91a7-474239cc46fc-kube-api-access-zjx6t" (OuterVolumeSpecName: "kube-api-access-zjx6t") pod "0ffc24e3-2b40-4368-91a7-474239cc46fc" (UID: "0ffc24e3-2b40-4368-91a7-474239cc46fc"). InnerVolumeSpecName "kube-api-access-zjx6t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:16:05 crc kubenswrapper[4792]: I0301 09:16:05.641852 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjx6t\" (UniqueName: \"kubernetes.io/projected/0ffc24e3-2b40-4368-91a7-474239cc46fc-kube-api-access-zjx6t\") on node \"crc\" DevicePath \"\""
Mar 01 09:16:06 crc kubenswrapper[4792]: I0301 09:16:06.187678 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539276-sbq86" event={"ID":"0ffc24e3-2b40-4368-91a7-474239cc46fc","Type":"ContainerDied","Data":"10d4f99a6afeafcf8f1f54bf00feff9a933803d7a9ba797926dca7f642308a46"}
Mar 01 09:16:06 crc kubenswrapper[4792]: I0301 09:16:06.187713 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10d4f99a6afeafcf8f1f54bf00feff9a933803d7a9ba797926dca7f642308a46"
Mar 01 09:16:06 crc kubenswrapper[4792]: I0301 09:16:06.187739 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539276-sbq86"
Mar 01 09:16:06 crc kubenswrapper[4792]: I0301 09:16:06.507832 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539270-q7hck"]
Mar 01 09:16:06 crc kubenswrapper[4792]: I0301 09:16:06.517435 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539270-q7hck"]
Mar 01 09:16:07 crc kubenswrapper[4792]: I0301 09:16:07.421046 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4130507-2de2-48c2-9c3f-e9474aeca556" path="/var/lib/kubelet/pods/b4130507-2de2-48c2-9c3f-e9474aeca556/volumes"
Mar 01 09:16:21 crc kubenswrapper[4792]: I0301 09:16:21.783285 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" podUID="f147eb3a-0f65-4ecb-b1a2-5d561c21253c" containerName="registry" containerID="cri-o://0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef" gracePeriod=30
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.192785 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.281399 4792 generic.go:334] "Generic (PLEG): container finished" podID="f147eb3a-0f65-4ecb-b1a2-5d561c21253c" containerID="0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef" exitCode=0
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.281457 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr"
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.281473 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" event={"ID":"f147eb3a-0f65-4ecb-b1a2-5d561c21253c","Type":"ContainerDied","Data":"0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef"}
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.281506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" event={"ID":"f147eb3a-0f65-4ecb-b1a2-5d561c21253c","Type":"ContainerDied","Data":"21cb9ec22cbd36ee1804b1e2f96533e30f8509709de84a1859f63834b8df4b3f"}
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.281527 4792 scope.go:117] "RemoveContainer" containerID="0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef"
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.293581 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-installation-pull-secrets\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") "
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.294102 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") "
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.294137 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-tls\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") "
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.294160 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-bound-sa-token\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") "
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.294178 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-trusted-ca\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") "
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.294205 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k789z\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-kube-api-access-k789z\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") "
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.294231 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-ca-trust-extracted\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") "
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.294271 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-certificates\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") "
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.296184 4792 scope.go:117] "RemoveContainer" containerID="0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef"
Mar 01 09:16:22 crc kubenswrapper[4792]: E0301 09:16:22.296546 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef\": container with ID starting with 0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef not found: ID does not exist" containerID="0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef"
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.296573 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef"} err="failed to get container status \"0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef\": rpc error: code = NotFound desc = could not find container \"0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef\": container with ID starting with 0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef not found: ID does not exist"
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.296702 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.297223 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.309227 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.309890 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-kube-api-access-k789z" (OuterVolumeSpecName: "kube-api-access-k789z") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "kube-api-access-k789z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.309948 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.309986 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.310069 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.312347 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.397312 4792 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.397531 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.397612 4792 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.397664 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k789z\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-kube-api-access-k789z\") on node \"crc\" DevicePath \"\""
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.397717 4792 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.397840 4792 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.397893 4792 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.624429 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4jwnr"]
Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.631303 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4jwnr"]
Mar 01 09:16:23 crc kubenswrapper[4792]: I0301 09:16:23.416796 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f147eb3a-0f65-4ecb-b1a2-5d561c21253c" path="/var/lib/kubelet/pods/f147eb3a-0f65-4ecb-b1a2-5d561c21253c/volumes"
Mar 01 09:16:34 crc kubenswrapper[4792]: I0301 09:16:34.943163 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:16:34 crc kubenswrapper[4792]: I0301 09:16:34.943703 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:16:34 crc kubenswrapper[4792]: I0301 09:16:34.943743 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv"
Mar 01 09:16:34 crc kubenswrapper[4792]: I0301 09:16:34.944252 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d953fb6bdd1dde8b39f7c850e5987057bc7f87ba2081744a1e12425c6ecc8289"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 01 09:16:34 crc kubenswrapper[4792]: I0301 09:16:34.944309 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://d953fb6bdd1dde8b39f7c850e5987057bc7f87ba2081744a1e12425c6ecc8289" gracePeriod=600
Mar 01 09:16:35 crc kubenswrapper[4792]: I0301 09:16:35.355611 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="d953fb6bdd1dde8b39f7c850e5987057bc7f87ba2081744a1e12425c6ecc8289" exitCode=0
Mar 01 09:16:35 crc kubenswrapper[4792]: I0301 09:16:35.355694 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"d953fb6bdd1dde8b39f7c850e5987057bc7f87ba2081744a1e12425c6ecc8289"}
Mar 01 09:16:35 crc kubenswrapper[4792]: I0301 09:16:35.356059 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"3d6c435219d8bc835f1f13dc6f334e1ca86fd764226b575caaadee4dad9e5209"}
Mar 01 09:16:35 crc kubenswrapper[4792]: I0301 09:16:35.356102 4792 scope.go:117] "RemoveContainer" containerID="47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac"
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.134250 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539278-dkkbp"]
Mar 01 09:18:00 crc kubenswrapper[4792]: E0301 09:18:00.135287 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ffc24e3-2b40-4368-91a7-474239cc46fc" containerName="oc"
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.135305 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ffc24e3-2b40-4368-91a7-474239cc46fc" containerName="oc"
Mar 01 09:18:00 crc kubenswrapper[4792]: E0301 09:18:00.135319 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f147eb3a-0f65-4ecb-b1a2-5d561c21253c" containerName="registry"
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.135329 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f147eb3a-0f65-4ecb-b1a2-5d561c21253c" containerName="registry"
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.135477 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ffc24e3-2b40-4368-91a7-474239cc46fc" containerName="oc"
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.135489 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f147eb3a-0f65-4ecb-b1a2-5d561c21253c" containerName="registry"
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.136003 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539278-dkkbp"
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.140046 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539278-dkkbp"]
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.140244 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.140501 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz"
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.140693 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.265250 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgk65\" (UniqueName: \"kubernetes.io/projected/0d17aa56-5b61-403d-9d20-cb300aabc44d-kube-api-access-wgk65\") pod \"auto-csr-approver-29539278-dkkbp\" (UID: \"0d17aa56-5b61-403d-9d20-cb300aabc44d\") " pod="openshift-infra/auto-csr-approver-29539278-dkkbp"
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.366724 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgk65\" (UniqueName: \"kubernetes.io/projected/0d17aa56-5b61-403d-9d20-cb300aabc44d-kube-api-access-wgk65\") pod \"auto-csr-approver-29539278-dkkbp\" (UID: \"0d17aa56-5b61-403d-9d20-cb300aabc44d\") " pod="openshift-infra/auto-csr-approver-29539278-dkkbp"
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.390617 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgk65\" (UniqueName: \"kubernetes.io/projected/0d17aa56-5b61-403d-9d20-cb300aabc44d-kube-api-access-wgk65\") pod \"auto-csr-approver-29539278-dkkbp\" (UID: \"0d17aa56-5b61-403d-9d20-cb300aabc44d\") " pod="openshift-infra/auto-csr-approver-29539278-dkkbp"
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.461253 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539278-dkkbp"
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.864816 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539278-dkkbp"]
Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.874983 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 01 09:18:01 crc kubenswrapper[4792]: I0301 09:18:01.860062 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539278-dkkbp" event={"ID":"0d17aa56-5b61-403d-9d20-cb300aabc44d","Type":"ContainerStarted","Data":"403087d613e3adc9d0ef98336533cbc95bef7d9cc37a9d027b28a5ec1411b85d"}
Mar 01 09:18:02 crc kubenswrapper[4792]: I0301 09:18:02.870687 4792 generic.go:334] "Generic (PLEG): container finished" podID="0d17aa56-5b61-403d-9d20-cb300aabc44d" containerID="ad33205b5c6776c36f5f90bc6d51a56bb6cf073bf39f5d634c13c03da022cc95" exitCode=0
Mar 01 09:18:02 crc kubenswrapper[4792]: I0301 09:18:02.870775 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539278-dkkbp" event={"ID":"0d17aa56-5b61-403d-9d20-cb300aabc44d","Type":"ContainerDied","Data":"ad33205b5c6776c36f5f90bc6d51a56bb6cf073bf39f5d634c13c03da022cc95"}
Mar 01 09:18:04 crc kubenswrapper[4792]: I0301 09:18:04.143585 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539278-dkkbp"
Mar 01 09:18:04 crc kubenswrapper[4792]: I0301 09:18:04.307260 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgk65\" (UniqueName: \"kubernetes.io/projected/0d17aa56-5b61-403d-9d20-cb300aabc44d-kube-api-access-wgk65\") pod \"0d17aa56-5b61-403d-9d20-cb300aabc44d\" (UID: \"0d17aa56-5b61-403d-9d20-cb300aabc44d\") "
Mar 01 09:18:04 crc kubenswrapper[4792]: I0301 09:18:04.312047 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d17aa56-5b61-403d-9d20-cb300aabc44d-kube-api-access-wgk65" (OuterVolumeSpecName: "kube-api-access-wgk65") pod "0d17aa56-5b61-403d-9d20-cb300aabc44d" (UID: "0d17aa56-5b61-403d-9d20-cb300aabc44d"). InnerVolumeSpecName "kube-api-access-wgk65". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:18:04 crc kubenswrapper[4792]: I0301 09:18:04.409155 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgk65\" (UniqueName: \"kubernetes.io/projected/0d17aa56-5b61-403d-9d20-cb300aabc44d-kube-api-access-wgk65\") on node \"crc\" DevicePath \"\""
Mar 01 09:18:04 crc kubenswrapper[4792]: I0301 09:18:04.883543 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539278-dkkbp" event={"ID":"0d17aa56-5b61-403d-9d20-cb300aabc44d","Type":"ContainerDied","Data":"403087d613e3adc9d0ef98336533cbc95bef7d9cc37a9d027b28a5ec1411b85d"}
Mar 01 09:18:04 crc kubenswrapper[4792]: I0301 09:18:04.883585 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539278-dkkbp"
Mar 01 09:18:04 crc kubenswrapper[4792]: I0301 09:18:04.883603 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="403087d613e3adc9d0ef98336533cbc95bef7d9cc37a9d027b28a5ec1411b85d"
Mar 01 09:18:05 crc kubenswrapper[4792]: I0301 09:18:05.205109 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539272-nq8dk"]
Mar 01 09:18:05 crc kubenswrapper[4792]: I0301 09:18:05.213495 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539272-nq8dk"]
Mar 01 09:18:05 crc kubenswrapper[4792]: I0301 09:18:05.414906 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d574e82-f840-4f0c-982d-f6a133bd64ae" path="/var/lib/kubelet/pods/8d574e82-f840-4f0c-982d-f6a133bd64ae/volumes"
Mar 01 09:18:51 crc kubenswrapper[4792]: I0301 09:18:51.787018 4792 scope.go:117] "RemoveContainer" containerID="5f51f2f66c61a102a6b43ee525bfb8b5ff9da77472d4107c4db4ba5e29f6a9ee"
Mar 01 09:18:51 crc kubenswrapper[4792]: I0301 09:18:51.830050 4792 scope.go:117] "RemoveContainer" containerID="f692f356115e5b53ef6a4d81f9a4c258c05c49397508f23df7e1bd78fc94331c"
Mar 01 09:19:04 crc kubenswrapper[4792]: I0301 09:19:04.943067 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:19:04 crc kubenswrapper[4792]: I0301 09:19:04.943745 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:19:34 crc kubenswrapper[4792]: I0301 09:19:34.942786 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:19:34 crc kubenswrapper[4792]: I0301 09:19:34.943387 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.140109 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539280-gz7v9"]
Mar 01 09:20:00 crc kubenswrapper[4792]: E0301 09:20:00.140898 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d17aa56-5b61-403d-9d20-cb300aabc44d" containerName="oc"
Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.140935 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d17aa56-5b61-403d-9d20-cb300aabc44d" containerName="oc"
Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.141064 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d17aa56-5b61-403d-9d20-cb300aabc44d" containerName="oc"
Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.141461 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539280-gz7v9"
Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.143224 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.144245 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.144386 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz"
Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.153895 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539280-gz7v9"]
Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.165095 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frqvt\" (UniqueName: \"kubernetes.io/projected/71c922d5-9de8-48d8-9f96-ad47d1d4017e-kube-api-access-frqvt\") pod \"auto-csr-approver-29539280-gz7v9\" (UID: \"71c922d5-9de8-48d8-9f96-ad47d1d4017e\") " pod="openshift-infra/auto-csr-approver-29539280-gz7v9"
Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.265981 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frqvt\" (UniqueName: \"kubernetes.io/projected/71c922d5-9de8-48d8-9f96-ad47d1d4017e-kube-api-access-frqvt\") pod \"auto-csr-approver-29539280-gz7v9\" (UID: \"71c922d5-9de8-48d8-9f96-ad47d1d4017e\") " pod="openshift-infra/auto-csr-approver-29539280-gz7v9"
Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.285862 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frqvt\" (UniqueName: \"kubernetes.io/projected/71c922d5-9de8-48d8-9f96-ad47d1d4017e-kube-api-access-frqvt\") pod \"auto-csr-approver-29539280-gz7v9\" (UID: \"71c922d5-9de8-48d8-9f96-ad47d1d4017e\") " pod="openshift-infra/auto-csr-approver-29539280-gz7v9"
Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.483024 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539280-gz7v9"
Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.683245 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539280-gz7v9"]
Mar 01 09:20:01 crc kubenswrapper[4792]: I0301 09:20:01.634842 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539280-gz7v9" event={"ID":"71c922d5-9de8-48d8-9f96-ad47d1d4017e","Type":"ContainerStarted","Data":"95d34678a38dd8b8a79d0061e52054c7c7d27d8ffc5a67d6b3d18d4b9720d4e6"}
Mar 01 09:20:02 crc kubenswrapper[4792]: I0301 09:20:02.642426 4792 generic.go:334] "Generic (PLEG): container finished" podID="71c922d5-9de8-48d8-9f96-ad47d1d4017e" containerID="752079600b535956d369c891a21eba391b40ef46c0f767fc3b0fcdc6ceb1bddc" exitCode=0
Mar 01 09:20:02 crc kubenswrapper[4792]: I0301 09:20:02.642502 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539280-gz7v9" event={"ID":"71c922d5-9de8-48d8-9f96-ad47d1d4017e","Type":"ContainerDied","Data":"752079600b535956d369c891a21eba391b40ef46c0f767fc3b0fcdc6ceb1bddc"}
Mar 01 09:20:03 crc kubenswrapper[4792]: I0301 09:20:03.898751 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539280-gz7v9"
Mar 01 09:20:03 crc kubenswrapper[4792]: I0301 09:20:03.907124 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frqvt\" (UniqueName: \"kubernetes.io/projected/71c922d5-9de8-48d8-9f96-ad47d1d4017e-kube-api-access-frqvt\") pod \"71c922d5-9de8-48d8-9f96-ad47d1d4017e\" (UID: \"71c922d5-9de8-48d8-9f96-ad47d1d4017e\") "
Mar 01 09:20:03 crc kubenswrapper[4792]: I0301 09:20:03.914974 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c922d5-9de8-48d8-9f96-ad47d1d4017e-kube-api-access-frqvt" (OuterVolumeSpecName: "kube-api-access-frqvt") pod "71c922d5-9de8-48d8-9f96-ad47d1d4017e" (UID: "71c922d5-9de8-48d8-9f96-ad47d1d4017e"). InnerVolumeSpecName "kube-api-access-frqvt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.008521 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frqvt\" (UniqueName: \"kubernetes.io/projected/71c922d5-9de8-48d8-9f96-ad47d1d4017e-kube-api-access-frqvt\") on node \"crc\" DevicePath \"\""
Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.657382 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539280-gz7v9" event={"ID":"71c922d5-9de8-48d8-9f96-ad47d1d4017e","Type":"ContainerDied","Data":"95d34678a38dd8b8a79d0061e52054c7c7d27d8ffc5a67d6b3d18d4b9720d4e6"}
Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.657827 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95d34678a38dd8b8a79d0061e52054c7c7d27d8ffc5a67d6b3d18d4b9720d4e6"
Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.657465 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539280-gz7v9"
Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.943391 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.943466 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.943524 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv"
Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.944398 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d6c435219d8bc835f1f13dc6f334e1ca86fd764226b575caaadee4dad9e5209"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.944486 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://3d6c435219d8bc835f1f13dc6f334e1ca86fd764226b575caaadee4dad9e5209" gracePeriod=600
Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.987958 4792 kubelet.go:2437] "SyncLoop DELETE" source="api"
pods=["openshift-infra/auto-csr-approver-29539274-zckv2"] Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.997859 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539274-zckv2"] Mar 01 09:20:05 crc kubenswrapper[4792]: I0301 09:20:05.427111 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81a3bf03-822b-4b69-93a3-b420d8f58efd" path="/var/lib/kubelet/pods/81a3bf03-822b-4b69-93a3-b420d8f58efd/volumes" Mar 01 09:20:05 crc kubenswrapper[4792]: I0301 09:20:05.664514 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="3d6c435219d8bc835f1f13dc6f334e1ca86fd764226b575caaadee4dad9e5209" exitCode=0 Mar 01 09:20:05 crc kubenswrapper[4792]: I0301 09:20:05.664563 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"3d6c435219d8bc835f1f13dc6f334e1ca86fd764226b575caaadee4dad9e5209"} Mar 01 09:20:05 crc kubenswrapper[4792]: I0301 09:20:05.664590 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"f223d76bf80d673696c61fa09c13cc363c202a24d360c9e2da1e52c335e05521"} Mar 01 09:20:05 crc kubenswrapper[4792]: I0301 09:20:05.664607 4792 scope.go:117] "RemoveContainer" containerID="d953fb6bdd1dde8b39f7c850e5987057bc7f87ba2081744a1e12425c6ecc8289" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.718001 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6"] Mar 01 09:20:48 crc kubenswrapper[4792]: E0301 09:20:48.718807 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c922d5-9de8-48d8-9f96-ad47d1d4017e" containerName="oc" Mar 01 09:20:48 crc kubenswrapper[4792]: 
I0301 09:20:48.718822 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c922d5-9de8-48d8-9f96-ad47d1d4017e" containerName="oc" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.718967 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c922d5-9de8-48d8-9f96-ad47d1d4017e" containerName="oc" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.719410 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.726575 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.727025 4792 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-bz5ls" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.727327 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.729689 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6"] Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.735415 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-4qgsm"] Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.740980 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-4qgsm" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.749460 4792 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-njvxn" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.758720 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-4qgsm"] Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.781513 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rckpb"] Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.782365 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.790404 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6tgd\" (UniqueName: \"kubernetes.io/projected/bf71ada0-c7b2-4255-bb2c-31ec3309a29d-kube-api-access-z6tgd\") pod \"cert-manager-858654f9db-4qgsm\" (UID: \"bf71ada0-c7b2-4255-bb2c-31ec3309a29d\") " pod="cert-manager/cert-manager-858654f9db-4qgsm" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.790470 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fqhx\" (UniqueName: \"kubernetes.io/projected/2071887a-31a9-428d-92d0-bf8a361011ca-kube-api-access-9fqhx\") pod \"cert-manager-cainjector-cf98fcc89-tm5s6\" (UID: \"2071887a-31a9-428d-92d0-bf8a361011ca\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.791011 4792 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-dc8jw" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.796315 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["cert-manager/cert-manager-webhook-687f57d79b-rckpb"] Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.891260 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2vcr\" (UniqueName: \"kubernetes.io/projected/a03eedd4-ecde-4905-95a7-c43b45ef9da9-kube-api-access-g2vcr\") pod \"cert-manager-webhook-687f57d79b-rckpb\" (UID: \"a03eedd4-ecde-4905-95a7-c43b45ef9da9\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.891510 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6tgd\" (UniqueName: \"kubernetes.io/projected/bf71ada0-c7b2-4255-bb2c-31ec3309a29d-kube-api-access-z6tgd\") pod \"cert-manager-858654f9db-4qgsm\" (UID: \"bf71ada0-c7b2-4255-bb2c-31ec3309a29d\") " pod="cert-manager/cert-manager-858654f9db-4qgsm" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.891635 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fqhx\" (UniqueName: \"kubernetes.io/projected/2071887a-31a9-428d-92d0-bf8a361011ca-kube-api-access-9fqhx\") pod \"cert-manager-cainjector-cf98fcc89-tm5s6\" (UID: \"2071887a-31a9-428d-92d0-bf8a361011ca\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.915754 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fqhx\" (UniqueName: \"kubernetes.io/projected/2071887a-31a9-428d-92d0-bf8a361011ca-kube-api-access-9fqhx\") pod \"cert-manager-cainjector-cf98fcc89-tm5s6\" (UID: \"2071887a-31a9-428d-92d0-bf8a361011ca\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.922670 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6tgd\" (UniqueName: 
\"kubernetes.io/projected/bf71ada0-c7b2-4255-bb2c-31ec3309a29d-kube-api-access-z6tgd\") pod \"cert-manager-858654f9db-4qgsm\" (UID: \"bf71ada0-c7b2-4255-bb2c-31ec3309a29d\") " pod="cert-manager/cert-manager-858654f9db-4qgsm" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.992715 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2vcr\" (UniqueName: \"kubernetes.io/projected/a03eedd4-ecde-4905-95a7-c43b45ef9da9-kube-api-access-g2vcr\") pod \"cert-manager-webhook-687f57d79b-rckpb\" (UID: \"a03eedd4-ecde-4905-95a7-c43b45ef9da9\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.008171 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2vcr\" (UniqueName: \"kubernetes.io/projected/a03eedd4-ecde-4905-95a7-c43b45ef9da9-kube-api-access-g2vcr\") pod \"cert-manager-webhook-687f57d79b-rckpb\" (UID: \"a03eedd4-ecde-4905-95a7-c43b45ef9da9\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.053011 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.073215 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-4qgsm" Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.094144 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.493779 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6"] Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.497779 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-4qgsm"] Mar 01 09:20:49 crc kubenswrapper[4792]: W0301 09:20:49.510542 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2071887a_31a9_428d_92d0_bf8a361011ca.slice/crio-00323368cd944ded4fac2cb52d8703ab37cd9a849120bafe9a11659e90d485a3 WatchSource:0}: Error finding container 00323368cd944ded4fac2cb52d8703ab37cd9a849120bafe9a11659e90d485a3: Status 404 returned error can't find the container with id 00323368cd944ded4fac2cb52d8703ab37cd9a849120bafe9a11659e90d485a3 Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.557496 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rckpb"] Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.688001 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" event={"ID":"a03eedd4-ecde-4905-95a7-c43b45ef9da9","Type":"ContainerStarted","Data":"47449944f9bdaccbb20eb4038f0bd25ccb1bd8c02d38d7e9132edb7737008e09"} Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.689271 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-4qgsm" event={"ID":"bf71ada0-c7b2-4255-bb2c-31ec3309a29d","Type":"ContainerStarted","Data":"8497a59c82276de798f225f8a47dafc1a3d068d0592c3a9073082c3b2de27d7b"} Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.690307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" 
event={"ID":"2071887a-31a9-428d-92d0-bf8a361011ca","Type":"ContainerStarted","Data":"00323368cd944ded4fac2cb52d8703ab37cd9a849120bafe9a11659e90d485a3"} Mar 01 09:20:51 crc kubenswrapper[4792]: I0301 09:20:51.898283 4792 scope.go:117] "RemoveContainer" containerID="8ef57da9b21fb114ba0f54a09e6174de667175684b505b54b0c846389388b402" Mar 01 09:20:53 crc kubenswrapper[4792]: I0301 09:20:53.711643 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" event={"ID":"2071887a-31a9-428d-92d0-bf8a361011ca","Type":"ContainerStarted","Data":"37e2f9809f32423b862ed39deb790929e382850f002f42f2a85d369a95317f6e"} Mar 01 09:20:53 crc kubenswrapper[4792]: I0301 09:20:53.713172 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" event={"ID":"a03eedd4-ecde-4905-95a7-c43b45ef9da9","Type":"ContainerStarted","Data":"31faf93bab9a2d4d492d2210a05a44e8bab4c57026a929527668b7edcccfd4ce"} Mar 01 09:20:53 crc kubenswrapper[4792]: I0301 09:20:53.713281 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" Mar 01 09:20:53 crc kubenswrapper[4792]: I0301 09:20:53.713965 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-4qgsm" event={"ID":"bf71ada0-c7b2-4255-bb2c-31ec3309a29d","Type":"ContainerStarted","Data":"909a22d7f23369670e6fa7fc2dfa31a5f93d0d6e7233311d411b3b32275ba42e"} Mar 01 09:20:53 crc kubenswrapper[4792]: I0301 09:20:53.725421 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" podStartSLOduration=2.02458537 podStartE2EDuration="5.725402698s" podCreationTimestamp="2026-03-01 09:20:48 +0000 UTC" firstStartedPulling="2026-03-01 09:20:49.512298049 +0000 UTC m=+778.754177286" lastFinishedPulling="2026-03-01 09:20:53.213115417 +0000 UTC m=+782.454994614" observedRunningTime="2026-03-01 
09:20:53.724860254 +0000 UTC m=+782.966739461" watchObservedRunningTime="2026-03-01 09:20:53.725402698 +0000 UTC m=+782.967281895" Mar 01 09:20:53 crc kubenswrapper[4792]: I0301 09:20:53.742270 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" podStartSLOduration=2.036089143 podStartE2EDuration="5.742251663s" podCreationTimestamp="2026-03-01 09:20:48 +0000 UTC" firstStartedPulling="2026-03-01 09:20:49.561143395 +0000 UTC m=+778.803022592" lastFinishedPulling="2026-03-01 09:20:53.267305875 +0000 UTC m=+782.509185112" observedRunningTime="2026-03-01 09:20:53.741564586 +0000 UTC m=+782.983443783" watchObservedRunningTime="2026-03-01 09:20:53.742251663 +0000 UTC m=+782.984130860" Mar 01 09:20:53 crc kubenswrapper[4792]: I0301 09:20:53.763106 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-4qgsm" podStartSLOduration=2.060401783 podStartE2EDuration="5.763088107s" podCreationTimestamp="2026-03-01 09:20:48 +0000 UTC" firstStartedPulling="2026-03-01 09:20:49.510008163 +0000 UTC m=+778.751887360" lastFinishedPulling="2026-03-01 09:20:53.212694487 +0000 UTC m=+782.454573684" observedRunningTime="2026-03-01 09:20:53.75955078 +0000 UTC m=+783.001429977" watchObservedRunningTime="2026-03-01 09:20:53.763088107 +0000 UTC m=+783.004967304" Mar 01 09:20:59 crc kubenswrapper[4792]: I0301 09:20:59.098333 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.273091 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7pp7m"] Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.274000 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" 
containerName="ovn-controller" containerID="cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.274135 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="sbdb" containerID="cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.274168 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="nbdb" containerID="cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.274196 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="northd" containerID="cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.274224 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.274250 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kube-rbac-proxy-node" containerID="cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: 
I0301 09:21:07.274277 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovn-acl-logging" containerID="cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.370959 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" containerID="cri-o://9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.627928 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/3.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.631692 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovn-acl-logging/0.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.632258 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovn-controller/0.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.632719 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687486 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bv7mw"] Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687685 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kubecfg-setup" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687698 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kubecfg-setup" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687708 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687714 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687725 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kube-rbac-proxy-node" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687732 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kube-rbac-proxy-node" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687742 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="sbdb" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687748 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="sbdb" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687756 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" 
containerName="kube-rbac-proxy-ovn-metrics" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687763 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kube-rbac-proxy-ovn-metrics" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687770 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="northd" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687775 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="northd" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687782 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687788 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687795 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovn-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687801 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovn-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687809 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687815 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687825 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovn-acl-logging" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687831 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovn-acl-logging" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687839 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="nbdb" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687845 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="nbdb" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687939 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kube-rbac-proxy-node" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687949 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="sbdb" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687959 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687964 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kube-rbac-proxy-ovn-metrics" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687971 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovn-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687980 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="nbdb" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687989 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovn-acl-logging" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687997 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="northd" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.688004 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.688010 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.688103 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.688109 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.688119 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.688125 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.688585 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.688602 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.696309 4792 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762669 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-systemd-units\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762733 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqvlh\" (UniqueName: \"kubernetes.io/projected/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-kube-api-access-kqvlh\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762781 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-kubelet\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762804 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-bin\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762829 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-openvswitch\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762863 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-ovn-kubernetes\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762875 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-systemd\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762890 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-config\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762928 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-netd\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762944 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-slash\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762961 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-node-log\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 
09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762977 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-script-lib\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763004 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-env-overrides\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763033 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763057 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-log-socket\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763101 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-ovn\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763120 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-netns\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763136 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-var-lib-openvswitch\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763150 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-etc-openvswitch\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763173 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovn-node-metrics-cert\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763285 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-systemd\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763304 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: 
\"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763320 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-log-socket\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763338 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-run-netns\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763363 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-etc-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763379 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovnkube-config\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763402 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763427 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-ovn\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763443 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovn-node-metrics-cert\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763461 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b8fv\" (UniqueName: \"kubernetes.io/projected/fa26d667-5bcd-4849-b79d-e47e08e703d1-kube-api-access-8b8fv\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763476 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-var-lib-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763501 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-cni-bin\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763516 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-env-overrides\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763537 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-cni-netd\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763557 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763576 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-kubelet\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763595 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovnkube-script-lib\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763612 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-systemd-units\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763624 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-node-log\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763641 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-slash\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763728 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763975 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764141 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764147 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764156 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764207 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764243 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-log-socket" (OuterVolumeSpecName: "log-socket") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764260 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764276 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764289 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764313 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764342 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764378 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-slash" (OuterVolumeSpecName: "host-slash") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764402 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-node-log" (OuterVolumeSpecName: "node-log") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764667 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764656 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764881 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.769492 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-kube-api-access-kqvlh" (OuterVolumeSpecName: "kube-api-access-kqvlh") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "kube-api-access-kqvlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.769648 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.776800 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.813205 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/2.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.813692 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/1.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.813741 4792 generic.go:334] "Generic (PLEG): container finished" podID="ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3" containerID="43e6bc214d9cf4f260a6b3f92b2bb7a5207d98f68c3fc275ce08d07a9684d65b" exitCode=2 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.813812 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerDied","Data":"43e6bc214d9cf4f260a6b3f92b2bb7a5207d98f68c3fc275ce08d07a9684d65b"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.813855 4792 scope.go:117] "RemoveContainer" containerID="833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc1" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.814472 4792 scope.go:117] "RemoveContainer" containerID="43e6bc214d9cf4f260a6b3f92b2bb7a5207d98f68c3fc275ce08d07a9684d65b" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.814838 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pq28p_openshift-multus(ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3)\"" pod="openshift-multus/multus-pq28p" podUID="ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.817432 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/3.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.820775 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovn-acl-logging/0.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821221 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovn-controller/0.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821514 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788" exitCode=0 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821536 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0" exitCode=0 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821544 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe" exitCode=0 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821555 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e" exitCode=0 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821562 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8" exitCode=0 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821569 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3" exitCode=0 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821575 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a" exitCode=143 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821581 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948" exitCode=143 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821609 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821635 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821646 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821655 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821667 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821676 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821687 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821696 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821702 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821707 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821712 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821716 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821721 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821726 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821731 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821735 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821742 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821750 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821756 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821762 4792 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821767 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821772 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821777 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821782 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821788 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821792 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821798 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821805 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821812 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821818 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821823 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821828 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821834 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821838 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821844 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} Mar 01 
09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821849 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821854 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821859 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821865 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"50a5a13eb582ab0332ca2180448f447182fdf584f81ede3ccf1a5ef0fe6bed57"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821873 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821882 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821888 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821894 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821899 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821936 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821942 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821947 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821952 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821957 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.822069 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.842077 4792 scope.go:117] "RemoveContainer" containerID="9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.862724 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7pp7m"] Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864459 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovnkube-script-lib\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864528 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-systemd-units\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864560 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-node-log\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864593 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-slash\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 
09:21:07.864621 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-systemd\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864646 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864664 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-systemd-units\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864678 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-log-socket\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864664 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-node-log\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864735 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-run-netns\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864748 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-systemd\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864772 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-slash\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864791 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-run-netns\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864801 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-etc-openvswitch\") pod \"ovnkube-node-bv7mw\" 
(UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864846 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovnkube-config\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864736 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-log-socket\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864894 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864974 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b8fv\" (UniqueName: \"kubernetes.io/projected/fa26d667-5bcd-4849-b79d-e47e08e703d1-kube-api-access-8b8fv\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865005 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-ovn\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovn-node-metrics-cert\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865049 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865076 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-var-lib-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864818 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-etc-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865053 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-var-lib-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865120 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865168 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-cni-bin\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865187 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-env-overrides\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-cni-netd\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865261 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865282 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-kubelet\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865330 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovnkube-script-lib\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865355 4792 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865366 4792 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865399 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865409 4792 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865418 4792 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-slash\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865427 4792 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865443 4792 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-node-log\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865454 4792 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865464 4792 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865470 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-ovn\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865442 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovnkube-config\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865475 4792 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-log-socket\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865503 4792 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865511 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-cni-netd\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865513 4792 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865532 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-cni-bin\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865558 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865575 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-kubelet\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865590 4792 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865599 4792 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865608 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865626 4792 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865635 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqvlh\" (UniqueName: \"kubernetes.io/projected/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-kube-api-access-kqvlh\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865644 4792 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865652 4792 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865661 4792 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865777 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-env-overrides\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.868229 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7pp7m"] Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.868497 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovn-node-metrics-cert\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.881722 4792 scope.go:117] "RemoveContainer" containerID="63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.882972 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b8fv\" (UniqueName: \"kubernetes.io/projected/fa26d667-5bcd-4849-b79d-e47e08e703d1-kube-api-access-8b8fv\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.894704 4792 scope.go:117] "RemoveContainer" 
containerID="ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.905511 4792 scope.go:117] "RemoveContainer" containerID="0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.917414 4792 scope.go:117] "RemoveContainer" containerID="05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.929886 4792 scope.go:117] "RemoveContainer" containerID="5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.942326 4792 scope.go:117] "RemoveContainer" containerID="0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.956203 4792 scope.go:117] "RemoveContainer" containerID="d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.970628 4792 scope.go:117] "RemoveContainer" containerID="d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.988219 4792 scope.go:117] "RemoveContainer" containerID="9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.989603 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": container with ID starting with 9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788 not found: ID does not exist" containerID="9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.989713 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} err="failed to get container status \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": rpc error: code = NotFound desc = could not find container \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": container with ID starting with 9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.989800 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.990510 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": container with ID starting with 157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c not found: ID does not exist" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.990659 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} err="failed to get container status \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": rpc error: code = NotFound desc = could not find container \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": container with ID starting with 157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.990710 4792 scope.go:117] "RemoveContainer" containerID="63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.991409 4792 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": container with ID starting with 63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0 not found: ID does not exist" containerID="63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.991434 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} err="failed to get container status \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": rpc error: code = NotFound desc = could not find container \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": container with ID starting with 63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.991457 4792 scope.go:117] "RemoveContainer" containerID="ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.991898 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": container with ID starting with ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe not found: ID does not exist" containerID="ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.991964 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} err="failed to get container status \"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": rpc error: code = NotFound desc = could not find container 
\"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": container with ID starting with ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.992010 4792 scope.go:117] "RemoveContainer" containerID="0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.992405 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": container with ID starting with 0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e not found: ID does not exist" containerID="0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.992445 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} err="failed to get container status \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": rpc error: code = NotFound desc = could not find container \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": container with ID starting with 0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.992463 4792 scope.go:117] "RemoveContainer" containerID="05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.992891 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": container with ID starting with 05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8 not found: ID does not exist" 
containerID="05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.992984 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} err="failed to get container status \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": rpc error: code = NotFound desc = could not find container \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": container with ID starting with 05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.993046 4792 scope.go:117] "RemoveContainer" containerID="5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.993462 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": container with ID starting with 5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3 not found: ID does not exist" containerID="5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.993515 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} err="failed to get container status \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": rpc error: code = NotFound desc = could not find container \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": container with ID starting with 5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.993538 4792 scope.go:117] 
"RemoveContainer" containerID="0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.993876 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": container with ID starting with 0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a not found: ID does not exist" containerID="0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.993982 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} err="failed to get container status \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": rpc error: code = NotFound desc = could not find container \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": container with ID starting with 0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.994090 4792 scope.go:117] "RemoveContainer" containerID="d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.994584 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": container with ID starting with d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948 not found: ID does not exist" containerID="d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.994624 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} err="failed to get container status \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": rpc error: code = NotFound desc = could not find container \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": container with ID starting with d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.994642 4792 scope.go:117] "RemoveContainer" containerID="d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.994998 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": container with ID starting with d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6 not found: ID does not exist" containerID="d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.995022 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} err="failed to get container status \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": rpc error: code = NotFound desc = could not find container \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": container with ID starting with d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.995038 4792 scope.go:117] "RemoveContainer" containerID="9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.995306 4792 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} err="failed to get container status \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": rpc error: code = NotFound desc = could not find container \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": container with ID starting with 9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.995332 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.995567 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} err="failed to get container status \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": rpc error: code = NotFound desc = could not find container \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": container with ID starting with 157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.995588 4792 scope.go:117] "RemoveContainer" containerID="63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.995987 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} err="failed to get container status \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": rpc error: code = NotFound desc = could not find container \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": container with ID starting with 63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0 not 
found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.996008 4792 scope.go:117] "RemoveContainer" containerID="ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.996273 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} err="failed to get container status \"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": rpc error: code = NotFound desc = could not find container \"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": container with ID starting with ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.996293 4792 scope.go:117] "RemoveContainer" containerID="0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.996644 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} err="failed to get container status \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": rpc error: code = NotFound desc = could not find container \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": container with ID starting with 0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.996665 4792 scope.go:117] "RemoveContainer" containerID="05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.996850 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} err="failed to get 
container status \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": rpc error: code = NotFound desc = could not find container \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": container with ID starting with 05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.996871 4792 scope.go:117] "RemoveContainer" containerID="5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.997220 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} err="failed to get container status \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": rpc error: code = NotFound desc = could not find container \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": container with ID starting with 5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.997243 4792 scope.go:117] "RemoveContainer" containerID="0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.997520 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} err="failed to get container status \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": rpc error: code = NotFound desc = could not find container \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": container with ID starting with 0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.997542 4792 scope.go:117] "RemoveContainer" 
containerID="d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.997819 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} err="failed to get container status \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": rpc error: code = NotFound desc = could not find container \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": container with ID starting with d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.997860 4792 scope.go:117] "RemoveContainer" containerID="d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.998342 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} err="failed to get container status \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": rpc error: code = NotFound desc = could not find container \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": container with ID starting with d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.998365 4792 scope.go:117] "RemoveContainer" containerID="9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.998583 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} err="failed to get container status \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": rpc error: code = NotFound desc = could 
not find container \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": container with ID starting with 9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.998606 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.998803 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} err="failed to get container status \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": rpc error: code = NotFound desc = could not find container \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": container with ID starting with 157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.998836 4792 scope.go:117] "RemoveContainer" containerID="63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.999234 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} err="failed to get container status \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": rpc error: code = NotFound desc = could not find container \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": container with ID starting with 63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.999258 4792 scope.go:117] "RemoveContainer" containerID="ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 
09:21:07.999509 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} err="failed to get container status \"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": rpc error: code = NotFound desc = could not find container \"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": container with ID starting with ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.999529 4792 scope.go:117] "RemoveContainer" containerID="0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.999871 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} err="failed to get container status \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": rpc error: code = NotFound desc = could not find container \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": container with ID starting with 0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.999895 4792 scope.go:117] "RemoveContainer" containerID="05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.001290 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} err="failed to get container status \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": rpc error: code = NotFound desc = could not find container \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": container with ID starting with 
05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8 not found: ID does not exist" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.001313 4792 scope.go:117] "RemoveContainer" containerID="5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.001681 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} err="failed to get container status \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": rpc error: code = NotFound desc = could not find container \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": container with ID starting with 5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3 not found: ID does not exist" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.001708 4792 scope.go:117] "RemoveContainer" containerID="0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.001980 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} err="failed to get container status \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": rpc error: code = NotFound desc = could not find container \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": container with ID starting with 0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a not found: ID does not exist" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.002007 4792 scope.go:117] "RemoveContainer" containerID="d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.002377 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} err="failed to get container status \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": rpc error: code = NotFound desc = could not find container \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": container with ID starting with d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948 not found: ID does not exist" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.002399 4792 scope.go:117] "RemoveContainer" containerID="d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.002681 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} err="failed to get container status \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": rpc error: code = NotFound desc = could not find container \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": container with ID starting with d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6 not found: ID does not exist" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.002709 4792 scope.go:117] "RemoveContainer" containerID="9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.002989 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} err="failed to get container status \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": rpc error: code = NotFound desc = could not find container \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": container with ID starting with 9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788 not found: ID does not 
exist" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.003013 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.003201 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} err="failed to get container status \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": rpc error: code = NotFound desc = could not find container \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": container with ID starting with 157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c not found: ID does not exist" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.003225 4792 scope.go:117] "RemoveContainer" containerID="63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.004272 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} err="failed to get container status \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": rpc error: code = NotFound desc = could not find container \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": container with ID starting with 63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0 not found: ID does not exist" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.004295 4792 scope.go:117] "RemoveContainer" containerID="ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.004742 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} err="failed to get container status 
\"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": rpc error: code = NotFound desc = could not find container \"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": container with ID starting with ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe not found: ID does not exist" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.004773 4792 scope.go:117] "RemoveContainer" containerID="0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.005199 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} err="failed to get container status \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": rpc error: code = NotFound desc = could not find container \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": container with ID starting with 0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e not found: ID does not exist" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.005229 4792 scope.go:117] "RemoveContainer" containerID="05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.005457 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} err="failed to get container status \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": rpc error: code = NotFound desc = could not find container \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": container with ID starting with 05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8 not found: ID does not exist" Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.005478 4792 scope.go:117] "RemoveContainer" 
containerID="5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.006048 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} err="failed to get container status \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": rpc error: code = NotFound desc = could not find container \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": container with ID starting with 5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3 not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.006069 4792 scope.go:117] "RemoveContainer" containerID="0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.006316 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} err="failed to get container status \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": rpc error: code = NotFound desc = could not find container \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": container with ID starting with 0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.006337 4792 scope.go:117] "RemoveContainer" containerID="d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.006872 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} err="failed to get container status \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": rpc error: code = NotFound desc = could not find container \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": container with ID starting with d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948 not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.006896 4792 scope.go:117] "RemoveContainer" containerID="d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.007257 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} err="failed to get container status \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": rpc error: code = NotFound desc = could not find container \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": container with ID starting with d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6 not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.020948 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw"
Mar 01 09:21:08 crc kubenswrapper[4792]: W0301 09:21:08.043931 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa26d667_5bcd_4849_b79d_e47e08e703d1.slice/crio-fe65abd025369aebfe45a140a086bc8b6b2e24c306998db6e63f2381f1ac688e WatchSource:0}: Error finding container fe65abd025369aebfe45a140a086bc8b6b2e24c306998db6e63f2381f1ac688e: Status 404 returned error can't find the container with id fe65abd025369aebfe45a140a086bc8b6b2e24c306998db6e63f2381f1ac688e
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.829491 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/2.log"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.832252 4792 generic.go:334] "Generic (PLEG): container finished" podID="fa26d667-5bcd-4849-b79d-e47e08e703d1" containerID="6ad83038e4755fd9a4617fe948f19c9667dee0891394f503d180010493494e46" exitCode=0
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.832288 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerDied","Data":"6ad83038e4755fd9a4617fe948f19c9667dee0891394f503d180010493494e46"}
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.832311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"fe65abd025369aebfe45a140a086bc8b6b2e24c306998db6e63f2381f1ac688e"}
Mar 01 09:21:09 crc kubenswrapper[4792]: I0301 09:21:09.415081 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" path="/var/lib/kubelet/pods/e2bd7bac-21cf-4657-ab84-68a14f99f8f0/volumes"
Mar 01 09:21:09 crc kubenswrapper[4792]: I0301 09:21:09.844669 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"2eae60e96dacbd394de9d78ba8aaebfeb80a83491f5e7a574680cb7c0d675d03"}
Mar 01 09:21:09 crc kubenswrapper[4792]: I0301 09:21:09.844721 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"c52bfc2bed86d384469b53f4a7cfd14f8174bdbd6b00e0c169d18730ac906eb9"}
Mar 01 09:21:09 crc kubenswrapper[4792]: I0301 09:21:09.844734 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"3372bcd280ade68b9a66bfecefd5d6817ff01835afca0f9d6dd203fa2b2fd016"}
Mar 01 09:21:09 crc kubenswrapper[4792]: I0301 09:21:09.844745 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"8001de267e82d6191b4533745448927cda99dbcd39bcaadeaeedd60ea3c6a167"}
Mar 01 09:21:09 crc kubenswrapper[4792]: I0301 09:21:09.844756 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"e87257de761d1810d6bd3514e204c87ebf71f832dc4c29e423b010f0523889bc"}
Mar 01 09:21:09 crc kubenswrapper[4792]: I0301 09:21:09.844767 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"75347305ce5e53b1700c3a13e4c3619fd2e4bcdfbf6da5dfa5ec0c63ac1491b6"}
Mar 01 09:21:11 crc kubenswrapper[4792]: I0301 09:21:11.859857 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"013c9d261e6682f9b171e22ddb5f4a018944dc084003803521411736649fbf53"}
Mar 01 09:21:14 crc kubenswrapper[4792]: I0301 09:21:14.887548 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"da8fe09d99e658d57a1f211fbd8110b35a1ff986c90fd4fa4bfee69ae029b0a7"}
Mar 01 09:21:14 crc kubenswrapper[4792]: I0301 09:21:14.889007 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw"
Mar 01 09:21:14 crc kubenswrapper[4792]: I0301 09:21:14.889065 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw"
Mar 01 09:21:14 crc kubenswrapper[4792]: I0301 09:21:14.890591 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw"
Mar 01 09:21:14 crc kubenswrapper[4792]: I0301 09:21:14.914229 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw"
Mar 01 09:21:14 crc kubenswrapper[4792]: I0301 09:21:14.915246 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw"
Mar 01 09:21:14 crc kubenswrapper[4792]: I0301 09:21:14.919814 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" podStartSLOduration=7.919800515 podStartE2EDuration="7.919800515s" podCreationTimestamp="2026-03-01 09:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:21:14.914833602 +0000 UTC m=+804.156712789" watchObservedRunningTime="2026-03-01 09:21:14.919800515 +0000 UTC m=+804.161679712"
Mar 01 09:21:22 crc kubenswrapper[4792]: I0301 09:21:22.408645 4792 scope.go:117] "RemoveContainer" containerID="43e6bc214d9cf4f260a6b3f92b2bb7a5207d98f68c3fc275ce08d07a9684d65b"
Mar 01 09:21:22 crc kubenswrapper[4792]: E0301 09:21:22.409400 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pq28p_openshift-multus(ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3)\"" pod="openshift-multus/multus-pq28p" podUID="ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3"
Mar 01 09:21:35 crc kubenswrapper[4792]: I0301 09:21:35.408629 4792 scope.go:117] "RemoveContainer" containerID="43e6bc214d9cf4f260a6b3f92b2bb7a5207d98f68c3fc275ce08d07a9684d65b"
Mar 01 09:21:35 crc kubenswrapper[4792]: I0301 09:21:35.906161 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"]
Mar 01 09:21:35 crc kubenswrapper[4792]: I0301 09:21:35.908082 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:35 crc kubenswrapper[4792]: I0301 09:21:35.910390 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 01 09:21:35 crc kubenswrapper[4792]: I0301 09:21:35.915152 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"]
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.035387 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.035501 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.035551 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhxg2\" (UniqueName: \"kubernetes.io/projected/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-kube-api-access-xhxg2\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.042259 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/2.log"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.042315 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerStarted","Data":"8e17d1f3b79d200f140ec6fd0c086d624c03058d1874684b990b78a70fe1d430"}
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.136617 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.136978 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.137035 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhxg2\" (UniqueName: \"kubernetes.io/projected/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-kube-api-access-xhxg2\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.137186 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.137388 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.156604 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhxg2\" (UniqueName: \"kubernetes.io/projected/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-kube-api-access-xhxg2\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.219826 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: E0301 09:21:36.243203 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_openshift-marketplace_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd_0(982d1baaac43b414f35be9de01a2fcb770603a9e76361e62b5a12971ac545955): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 01 09:21:36 crc kubenswrapper[4792]: E0301 09:21:36.243270 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_openshift-marketplace_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd_0(982d1baaac43b414f35be9de01a2fcb770603a9e76361e62b5a12971ac545955): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: E0301 09:21:36.243291 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_openshift-marketplace_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd_0(982d1baaac43b414f35be9de01a2fcb770603a9e76361e62b5a12971ac545955): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: E0301 09:21:36.243335 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_openshift-marketplace(8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_openshift-marketplace(8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_openshift-marketplace_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd_0(982d1baaac43b414f35be9de01a2fcb770603a9e76361e62b5a12971ac545955): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd"
Mar 01 09:21:37 crc kubenswrapper[4792]: I0301 09:21:37.047842 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:37 crc kubenswrapper[4792]: I0301 09:21:37.048509 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:37 crc kubenswrapper[4792]: I0301 09:21:37.298781 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"]
Mar 01 09:21:38 crc kubenswrapper[4792]: I0301 09:21:38.041952 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw"
Mar 01 09:21:38 crc kubenswrapper[4792]: I0301 09:21:38.056046 4792 generic.go:334] "Generic (PLEG): container finished" podID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerID="76925cc358923b750081a35dd80c9bf1cc4033be7b2c6170e4dc21dd66f8ec00" exitCode=0
Mar 01 09:21:38 crc kubenswrapper[4792]: I0301 09:21:38.056091 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst" event={"ID":"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd","Type":"ContainerDied","Data":"76925cc358923b750081a35dd80c9bf1cc4033be7b2c6170e4dc21dd66f8ec00"}
Mar 01 09:21:38 crc kubenswrapper[4792]: I0301 09:21:38.056115 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst" event={"ID":"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd","Type":"ContainerStarted","Data":"fa7a7493167702c92a410f30bd06af51593614b91aa3e9ddce0d957224b436d9"}
Mar 01 09:21:40 crc kubenswrapper[4792]: I0301 09:21:40.067865 4792 generic.go:334] "Generic (PLEG): container finished" podID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerID="b8da3653fca540d47b970da8d99ec257f6fa500230f30cb86cb66bf52be288aa" exitCode=0
Mar 01 09:21:40 crc kubenswrapper[4792]: I0301 09:21:40.067935 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst" event={"ID":"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd","Type":"ContainerDied","Data":"b8da3653fca540d47b970da8d99ec257f6fa500230f30cb86cb66bf52be288aa"}
Mar 01 09:21:40 crc kubenswrapper[4792]: E0301 09:21:40.395988 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b2fbe4e_4a71_4ce1_b7cc_3063b89d65bd.slice/crio-conmon-5d89cfe5e15e45905d7e95e396f3b50c3188d4882555350d9b4867807a42dc09.scope\": RecentStats: unable to find data in memory cache]"
Mar 01 09:21:41 crc kubenswrapper[4792]: I0301 09:21:41.078055 4792 generic.go:334] "Generic (PLEG): container finished" podID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerID="5d89cfe5e15e45905d7e95e396f3b50c3188d4882555350d9b4867807a42dc09" exitCode=0
Mar 01 09:21:41 crc kubenswrapper[4792]: I0301 09:21:41.078114 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst" event={"ID":"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd","Type":"ContainerDied","Data":"5d89cfe5e15e45905d7e95e396f3b50c3188d4882555350d9b4867807a42dc09"}
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.336444 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.517206 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-util\") pod \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") "
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.517281 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhxg2\" (UniqueName: \"kubernetes.io/projected/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-kube-api-access-xhxg2\") pod \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") "
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.517315 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-bundle\") pod \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") "
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.517850 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-bundle" (OuterVolumeSpecName: "bundle") pod "8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" (UID: "8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.525942 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-kube-api-access-xhxg2" (OuterVolumeSpecName: "kube-api-access-xhxg2") pod "8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" (UID: "8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd"). InnerVolumeSpecName "kube-api-access-xhxg2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.534733 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-util" (OuterVolumeSpecName: "util") pod "8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" (UID: "8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.618973 4792 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-util\") on node \"crc\" DevicePath \"\""
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.619052 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhxg2\" (UniqueName: \"kubernetes.io/projected/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-kube-api-access-xhxg2\") on node \"crc\" DevicePath \"\""
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.619070 4792 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:21:43 crc kubenswrapper[4792]: I0301 09:21:43.093651 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst" event={"ID":"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd","Type":"ContainerDied","Data":"fa7a7493167702c92a410f30bd06af51593614b91aa3e9ddce0d957224b436d9"}
Mar 01 09:21:43 crc kubenswrapper[4792]: I0301 09:21:43.094143 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa7a7493167702c92a410f30bd06af51593614b91aa3e9ddce0d957224b436d9"
Mar 01 09:21:43 crc kubenswrapper[4792]: I0301 09:21:43.094227 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.429872 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw"]
Mar 01 09:21:44 crc kubenswrapper[4792]: E0301 09:21:44.430139 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerName="pull"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.430151 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerName="pull"
Mar 01 09:21:44 crc kubenswrapper[4792]: E0301 09:21:44.430160 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerName="util"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.430167 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerName="util"
Mar 01 09:21:44 crc kubenswrapper[4792]: E0301 09:21:44.430177 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerName="extract"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.430184 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerName="extract"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.430276 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerName="extract"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.430641 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.432198 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-s9kc7"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.433418 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.434418 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.479418 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw"]
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.540752 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9dxl\" (UniqueName: \"kubernetes.io/projected/fb942d1c-2a1a-4265-ae29-02f185d4cc40-kube-api-access-g9dxl\") pod \"nmstate-operator-75c5dccd6c-chfpw\" (UID: \"fb942d1c-2a1a-4265-ae29-02f185d4cc40\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.642423 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9dxl\" (UniqueName: \"kubernetes.io/projected/fb942d1c-2a1a-4265-ae29-02f185d4cc40-kube-api-access-g9dxl\") pod \"nmstate-operator-75c5dccd6c-chfpw\" (UID: \"fb942d1c-2a1a-4265-ae29-02f185d4cc40\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.658841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9dxl\" (UniqueName: \"kubernetes.io/projected/fb942d1c-2a1a-4265-ae29-02f185d4cc40-kube-api-access-g9dxl\") pod \"nmstate-operator-75c5dccd6c-chfpw\" (UID: \"fb942d1c-2a1a-4265-ae29-02f185d4cc40\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.744161 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.992413 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw"]
Mar 01 09:21:45 crc kubenswrapper[4792]: I0301 09:21:45.104144 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw" event={"ID":"fb942d1c-2a1a-4265-ae29-02f185d4cc40","Type":"ContainerStarted","Data":"56a3baf522f3befc255427a7e92fb6d3eaea9fe2c0ba4170d7dcb4b23f4876bc"}
Mar 01 09:21:48 crc kubenswrapper[4792]: I0301 09:21:48.120790 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw" event={"ID":"fb942d1c-2a1a-4265-ae29-02f185d4cc40","Type":"ContainerStarted","Data":"246728d3a0a95716bc9c743cd4742fc1074032d3fac38b2b4c5926f20fa51c17"}
Mar 01 09:21:48 crc kubenswrapper[4792]: I0301 09:21:48.134160 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw" podStartSLOduration=1.8528042660000001 podStartE2EDuration="4.134146652s" podCreationTimestamp="2026-03-01 09:21:44 +0000 UTC" firstStartedPulling="2026-03-01 09:21:44.999953816 +0000 UTC m=+834.241833023" lastFinishedPulling="2026-03-01 09:21:47.281296212 +0000 UTC m=+836.523175409" observedRunningTime="2026-03-01 09:21:48.133164757 +0000 UTC m=+837.375043954" watchObservedRunningTime="2026-03-01 09:21:48.134146652 +0000 UTC m=+837.376025849"
Mar 01 09:21:48 crc kubenswrapper[4792]: I0301 09:21:48.216445 4792 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.116090 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-97cv9"]
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.116925 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.118853 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-9pv2w"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.134075 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-97cv9"]
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.159034 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc"]
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.159765 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.161659 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.172756 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc"]
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.179053 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-9j2tz"]
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.179858 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-9j2tz"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.220392 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj7d9\" (UniqueName: \"kubernetes.io/projected/bfe2cc56-28ca-4201-ba5a-4208dd1ec818-kube-api-access-lj7d9\") pod \"nmstate-metrics-69594cc75-97cv9\" (UID: \"bfe2cc56-28ca-4201-ba5a-4208dd1ec818\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.282999 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm"]
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.283785 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.286525 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-kt9z4"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.286525 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.288782 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.302811 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm"]
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.321986 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkh22\" (UniqueName: \"kubernetes.io/projected/aa2300d6-10c0-4dc9-812a-fcb30f09920e-kube-api-access-hkh22\") pod \"nmstate-webhook-786f45cff4-zwhpc\" (UID: \"aa2300d6-10c0-4dc9-812a-fcb30f09920e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.322025 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5mxc\" (UniqueName: \"kubernetes.io/projected/7105919f-ddac-45db-a8f7-bd927e5737df-kube-api-access-h5mxc\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.322070 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-nmstate-lock\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.322102 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-ovs-socket\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.322123 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-dbus-socket\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.322152 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aa2300d6-10c0-4dc9-812a-fcb30f09920e-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-zwhpc\" (UID: \"aa2300d6-10c0-4dc9-812a-fcb30f09920e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.322176 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj7d9\" (UniqueName: \"kubernetes.io/projected/bfe2cc56-28ca-4201-ba5a-4208dd1ec818-kube-api-access-lj7d9\") pod \"nmstate-metrics-69594cc75-97cv9\" (UID: \"bfe2cc56-28ca-4201-ba5a-4208dd1ec818\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.341032 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj7d9\" (UniqueName: \"kubernetes.io/projected/bfe2cc56-28ca-4201-ba5a-4208dd1ec818-kube-api-access-lj7d9\") pod \"nmstate-metrics-69594cc75-97cv9\" (UID: \"bfe2cc56-28ca-4201-ba5a-4208dd1ec818\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423505 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkh22\" (UniqueName: \"kubernetes.io/projected/aa2300d6-10c0-4dc9-812a-fcb30f09920e-kube-api-access-hkh22\") pod \"nmstate-webhook-786f45cff4-zwhpc\" (UID: \"aa2300d6-10c0-4dc9-812a-fcb30f09920e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423555 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5mxc\" (UniqueName: \"kubernetes.io/projected/7105919f-ddac-45db-a8f7-bd927e5737df-kube-api-access-h5mxc\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz"
Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423594 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gvz9\" (UniqueName:
\"kubernetes.io/projected/f7ca92c8-f38b-4a0a-b330-5809993cbb49-kube-api-access-4gvz9\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423623 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-nmstate-lock\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423650 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f7ca92c8-f38b-4a0a-b330-5809993cbb49-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423721 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-ovs-socket\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423752 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-dbus-socket\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423780 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/f7ca92c8-f38b-4a0a-b330-5809993cbb49-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423822 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aa2300d6-10c0-4dc9-812a-fcb30f09920e-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-zwhpc\" (UID: \"aa2300d6-10c0-4dc9-812a-fcb30f09920e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.424497 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-nmstate-lock\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.424561 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-ovs-socket\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.424881 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-dbus-socket\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.429633 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/aa2300d6-10c0-4dc9-812a-fcb30f09920e-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-zwhpc\" (UID: \"aa2300d6-10c0-4dc9-812a-fcb30f09920e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.431312 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.475694 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5mxc\" (UniqueName: \"kubernetes.io/projected/7105919f-ddac-45db-a8f7-bd927e5737df-kube-api-access-h5mxc\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.497290 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkh22\" (UniqueName: \"kubernetes.io/projected/aa2300d6-10c0-4dc9-812a-fcb30f09920e-kube-api-access-hkh22\") pod \"nmstate-webhook-786f45cff4-zwhpc\" (UID: \"aa2300d6-10c0-4dc9-812a-fcb30f09920e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.497546 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.525129 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gvz9\" (UniqueName: \"kubernetes.io/projected/f7ca92c8-f38b-4a0a-b330-5809993cbb49-kube-api-access-4gvz9\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.525177 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f7ca92c8-f38b-4a0a-b330-5809993cbb49-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.525224 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ca92c8-f38b-4a0a-b330-5809993cbb49-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.526576 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f7ca92c8-f38b-4a0a-b330-5809993cbb49-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.527827 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ca92c8-f38b-4a0a-b330-5809993cbb49-plugin-serving-cert\") 
pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.548831 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d787bcd-49gzg"] Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.549664 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.553801 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gvz9\" (UniqueName: \"kubernetes.io/projected/f7ca92c8-f38b-4a0a-b330-5809993cbb49-kube-api-access-4gvz9\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.568802 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d787bcd-49gzg"] Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.596312 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.727602 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d040336d-5b5f-44e9-959d-84260224c25d-console-serving-cert\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.727885 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-service-ca\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.727939 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54sqs\" (UniqueName: \"kubernetes.io/projected/d040336d-5b5f-44e9-959d-84260224c25d-kube-api-access-54sqs\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.727956 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-console-config\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.727990 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/d040336d-5b5f-44e9-959d-84260224c25d-console-oauth-config\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.728029 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-trusted-ca-bundle\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.728081 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-oauth-serving-cert\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.754469 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-97cv9"] Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.773437 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.832548 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d040336d-5b5f-44e9-959d-84260224c25d-console-oauth-config\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.832603 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-trusted-ca-bundle\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.832665 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-oauth-serving-cert\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.832699 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d040336d-5b5f-44e9-959d-84260224c25d-console-serving-cert\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.832733 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-service-ca\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " 
pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.832762 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54sqs\" (UniqueName: \"kubernetes.io/projected/d040336d-5b5f-44e9-959d-84260224c25d-kube-api-access-54sqs\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.832787 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-console-config\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.833698 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-oauth-serving-cert\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.833785 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-console-config\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.834732 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-trusted-ca-bundle\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 
09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.834776 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-service-ca\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.837530 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d040336d-5b5f-44e9-959d-84260224c25d-console-serving-cert\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.838452 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d040336d-5b5f-44e9-959d-84260224c25d-console-oauth-config\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.855394 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54sqs\" (UniqueName: \"kubernetes.io/projected/d040336d-5b5f-44e9-959d-84260224c25d-kube-api-access-54sqs\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.874084 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.940640 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc"] Mar 01 09:21:50 crc kubenswrapper[4792]: I0301 09:21:50.023616 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm"] Mar 01 09:21:50 crc kubenswrapper[4792]: W0301 09:21:50.029078 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7ca92c8_f38b_4a0a_b330_5809993cbb49.slice/crio-b19f0c79b85a6766587e8c440aa6b1232df07e88e2e35736f259bd8f65cda23f WatchSource:0}: Error finding container b19f0c79b85a6766587e8c440aa6b1232df07e88e2e35736f259bd8f65cda23f: Status 404 returned error can't find the container with id b19f0c79b85a6766587e8c440aa6b1232df07e88e2e35736f259bd8f65cda23f Mar 01 09:21:50 crc kubenswrapper[4792]: I0301 09:21:50.133835 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-9j2tz" event={"ID":"7105919f-ddac-45db-a8f7-bd927e5737df","Type":"ContainerStarted","Data":"e2ca7a86913e06245de7da9bea06c93473666473dd10d1f84831b2eca13979cd"} Mar 01 09:21:50 crc kubenswrapper[4792]: I0301 09:21:50.135194 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9" event={"ID":"bfe2cc56-28ca-4201-ba5a-4208dd1ec818","Type":"ContainerStarted","Data":"8befbe26d96643fd32f1674a35656efdebc284b5f6c25b19a271aacce8d6050a"} Mar 01 09:21:50 crc kubenswrapper[4792]: I0301 09:21:50.136111 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" event={"ID":"aa2300d6-10c0-4dc9-812a-fcb30f09920e","Type":"ContainerStarted","Data":"a7ee32be1ab30155d414feefebdc19ab56d1866ab4ffd457abd74dcf66fa8915"} Mar 01 09:21:50 crc kubenswrapper[4792]: I0301 
09:21:50.137124 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" event={"ID":"f7ca92c8-f38b-4a0a-b330-5809993cbb49","Type":"ContainerStarted","Data":"b19f0c79b85a6766587e8c440aa6b1232df07e88e2e35736f259bd8f65cda23f"} Mar 01 09:21:50 crc kubenswrapper[4792]: I0301 09:21:50.294082 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d787bcd-49gzg"] Mar 01 09:21:50 crc kubenswrapper[4792]: W0301 09:21:50.295032 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd040336d_5b5f_44e9_959d_84260224c25d.slice/crio-df68ab51b3fb82ea3e3c75e290c5ebe27d1af3c18b75b28c7b5448829cb26e7d WatchSource:0}: Error finding container df68ab51b3fb82ea3e3c75e290c5ebe27d1af3c18b75b28c7b5448829cb26e7d: Status 404 returned error can't find the container with id df68ab51b3fb82ea3e3c75e290c5ebe27d1af3c18b75b28c7b5448829cb26e7d Mar 01 09:21:51 crc kubenswrapper[4792]: I0301 09:21:51.145276 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d787bcd-49gzg" event={"ID":"d040336d-5b5f-44e9-959d-84260224c25d","Type":"ContainerStarted","Data":"8e3bc4647d1bcc0ace6be9cf55ca15cc2f9ebf86f45cef84623bffbddf5feedc"} Mar 01 09:21:51 crc kubenswrapper[4792]: I0301 09:21:51.145624 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d787bcd-49gzg" event={"ID":"d040336d-5b5f-44e9-959d-84260224c25d","Type":"ContainerStarted","Data":"df68ab51b3fb82ea3e3c75e290c5ebe27d1af3c18b75b28c7b5448829cb26e7d"} Mar 01 09:21:51 crc kubenswrapper[4792]: I0301 09:21:51.162875 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d787bcd-49gzg" podStartSLOduration=2.162853582 podStartE2EDuration="2.162853582s" podCreationTimestamp="2026-03-01 09:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:21:51.161750365 +0000 UTC m=+840.403629562" watchObservedRunningTime="2026-03-01 09:21:51.162853582 +0000 UTC m=+840.404732779" Mar 01 09:21:53 crc kubenswrapper[4792]: I0301 09:21:53.159134 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" event={"ID":"f7ca92c8-f38b-4a0a-b330-5809993cbb49","Type":"ContainerStarted","Data":"e352eeb70fea6a32f9256ee242d703c85d61f0b059a6b9ebc168263dfa4de9e4"} Mar 01 09:21:53 crc kubenswrapper[4792]: I0301 09:21:53.161223 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9" event={"ID":"bfe2cc56-28ca-4201-ba5a-4208dd1ec818","Type":"ContainerStarted","Data":"bf5c4e80cffac78d41a00f0e98d52038426e8baa29faf6db8b2af41956c8bb77"} Mar 01 09:21:53 crc kubenswrapper[4792]: I0301 09:21:53.163147 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" event={"ID":"aa2300d6-10c0-4dc9-812a-fcb30f09920e","Type":"ContainerStarted","Data":"995a34fecaba5c30fc25409e04e2a82601abf24151a911e8ee750fe446f99450"} Mar 01 09:21:53 crc kubenswrapper[4792]: I0301 09:21:53.163261 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:21:53 crc kubenswrapper[4792]: I0301 09:21:53.177965 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" podStartSLOduration=1.320982143 podStartE2EDuration="4.177944924s" podCreationTimestamp="2026-03-01 09:21:49 +0000 UTC" firstStartedPulling="2026-03-01 09:21:50.036832987 +0000 UTC m=+839.278712184" lastFinishedPulling="2026-03-01 09:21:52.893795768 +0000 UTC m=+842.135674965" observedRunningTime="2026-03-01 09:21:53.17492343 +0000 UTC m=+842.416802627" watchObservedRunningTime="2026-03-01 
09:21:53.177944924 +0000 UTC m=+842.419824121" Mar 01 09:21:54 crc kubenswrapper[4792]: I0301 09:21:54.170114 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-9j2tz" event={"ID":"7105919f-ddac-45db-a8f7-bd927e5737df","Type":"ContainerStarted","Data":"31459e1e34644765b296e5000ebb6bb93f3dea59a8dd2dc2c50eadced7ab305a"} Mar 01 09:21:54 crc kubenswrapper[4792]: I0301 09:21:54.187896 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-9j2tz" podStartSLOduration=1.8489637220000001 podStartE2EDuration="5.187876664s" podCreationTimestamp="2026-03-01 09:21:49 +0000 UTC" firstStartedPulling="2026-03-01 09:21:49.557138451 +0000 UTC m=+838.799017648" lastFinishedPulling="2026-03-01 09:21:52.896051383 +0000 UTC m=+842.137930590" observedRunningTime="2026-03-01 09:21:54.185936566 +0000 UTC m=+843.427815773" watchObservedRunningTime="2026-03-01 09:21:54.187876664 +0000 UTC m=+843.429755861" Mar 01 09:21:54 crc kubenswrapper[4792]: I0301 09:21:54.188743 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" podStartSLOduration=2.22955115 podStartE2EDuration="5.188735515s" podCreationTimestamp="2026-03-01 09:21:49 +0000 UTC" firstStartedPulling="2026-03-01 09:21:49.956217356 +0000 UTC m=+839.198096553" lastFinishedPulling="2026-03-01 09:21:52.915401721 +0000 UTC m=+842.157280918" observedRunningTime="2026-03-01 09:21:53.204998632 +0000 UTC m=+842.446877819" watchObservedRunningTime="2026-03-01 09:21:54.188735515 +0000 UTC m=+843.430614712" Mar 01 09:21:54 crc kubenswrapper[4792]: I0301 09:21:54.498828 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:56 crc kubenswrapper[4792]: I0301 09:21:56.183588 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9" 
event={"ID":"bfe2cc56-28ca-4201-ba5a-4208dd1ec818","Type":"ContainerStarted","Data":"e0f6095afec3a06107aabffef4c0efee5b1f77d77ab71fbe9c2b8cadaaa723dc"} Mar 01 09:21:56 crc kubenswrapper[4792]: I0301 09:21:56.205775 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9" podStartSLOduration=1.773733074 podStartE2EDuration="7.205743413s" podCreationTimestamp="2026-03-01 09:21:49 +0000 UTC" firstStartedPulling="2026-03-01 09:21:49.769956267 +0000 UTC m=+839.011835464" lastFinishedPulling="2026-03-01 09:21:55.201966606 +0000 UTC m=+844.443845803" observedRunningTime="2026-03-01 09:21:56.203821486 +0000 UTC m=+845.445700723" watchObservedRunningTime="2026-03-01 09:21:56.205743413 +0000 UTC m=+845.447622660" Mar 01 09:21:59 crc kubenswrapper[4792]: I0301 09:21:59.525051 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:59 crc kubenswrapper[4792]: I0301 09:21:59.874467 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:59 crc kubenswrapper[4792]: I0301 09:21:59.874510 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:59 crc kubenswrapper[4792]: I0301 09:21:59.879396 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.128620 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539282-dcfkb"] Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.129282 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539282-dcfkb" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.132125 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.132478 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.134017 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.138993 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539282-dcfkb"] Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.214348 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.280034 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdhdv\" (UniqueName: \"kubernetes.io/projected/1fcb7c96-6ab5-413c-b776-d1bc938e85c0-kube-api-access-xdhdv\") pod \"auto-csr-approver-29539282-dcfkb\" (UID: \"1fcb7c96-6ab5-413c-b776-d1bc938e85c0\") " pod="openshift-infra/auto-csr-approver-29539282-dcfkb" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.298447 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zrzcg"] Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.380931 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdhdv\" (UniqueName: \"kubernetes.io/projected/1fcb7c96-6ab5-413c-b776-d1bc938e85c0-kube-api-access-xdhdv\") pod \"auto-csr-approver-29539282-dcfkb\" (UID: \"1fcb7c96-6ab5-413c-b776-d1bc938e85c0\") " pod="openshift-infra/auto-csr-approver-29539282-dcfkb" Mar 01 
09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.406748 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdhdv\" (UniqueName: \"kubernetes.io/projected/1fcb7c96-6ab5-413c-b776-d1bc938e85c0-kube-api-access-xdhdv\") pod \"auto-csr-approver-29539282-dcfkb\" (UID: \"1fcb7c96-6ab5-413c-b776-d1bc938e85c0\") " pod="openshift-infra/auto-csr-approver-29539282-dcfkb" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.481383 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539282-dcfkb" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.855489 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539282-dcfkb"] Mar 01 09:22:00 crc kubenswrapper[4792]: W0301 09:22:00.859241 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fcb7c96_6ab5_413c_b776_d1bc938e85c0.slice/crio-9ed4209950b7dc0c0219f7cbf98dfb2fd0f042c8e8d978db82b2efba327f129a WatchSource:0}: Error finding container 9ed4209950b7dc0c0219f7cbf98dfb2fd0f042c8e8d978db82b2efba327f129a: Status 404 returned error can't find the container with id 9ed4209950b7dc0c0219f7cbf98dfb2fd0f042c8e8d978db82b2efba327f129a Mar 01 09:22:01 crc kubenswrapper[4792]: I0301 09:22:01.217610 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539282-dcfkb" event={"ID":"1fcb7c96-6ab5-413c-b776-d1bc938e85c0","Type":"ContainerStarted","Data":"9ed4209950b7dc0c0219f7cbf98dfb2fd0f042c8e8d978db82b2efba327f129a"} Mar 01 09:22:02 crc kubenswrapper[4792]: I0301 09:22:02.225562 4792 generic.go:334] "Generic (PLEG): container finished" podID="1fcb7c96-6ab5-413c-b776-d1bc938e85c0" containerID="b3416cff442b7b3bec1893fb5c0aa2d61087db5d4679dae5ed62f8ea4a150ca7" exitCode=0 Mar 01 09:22:02 crc kubenswrapper[4792]: I0301 09:22:02.225644 4792 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29539282-dcfkb" event={"ID":"1fcb7c96-6ab5-413c-b776-d1bc938e85c0","Type":"ContainerDied","Data":"b3416cff442b7b3bec1893fb5c0aa2d61087db5d4679dae5ed62f8ea4a150ca7"} Mar 01 09:22:03 crc kubenswrapper[4792]: I0301 09:22:03.482075 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539282-dcfkb" Mar 01 09:22:03 crc kubenswrapper[4792]: I0301 09:22:03.620638 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdhdv\" (UniqueName: \"kubernetes.io/projected/1fcb7c96-6ab5-413c-b776-d1bc938e85c0-kube-api-access-xdhdv\") pod \"1fcb7c96-6ab5-413c-b776-d1bc938e85c0\" (UID: \"1fcb7c96-6ab5-413c-b776-d1bc938e85c0\") " Mar 01 09:22:03 crc kubenswrapper[4792]: I0301 09:22:03.629351 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fcb7c96-6ab5-413c-b776-d1bc938e85c0-kube-api-access-xdhdv" (OuterVolumeSpecName: "kube-api-access-xdhdv") pod "1fcb7c96-6ab5-413c-b776-d1bc938e85c0" (UID: "1fcb7c96-6ab5-413c-b776-d1bc938e85c0"). InnerVolumeSpecName "kube-api-access-xdhdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:22:03 crc kubenswrapper[4792]: I0301 09:22:03.723975 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdhdv\" (UniqueName: \"kubernetes.io/projected/1fcb7c96-6ab5-413c-b776-d1bc938e85c0-kube-api-access-xdhdv\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:04 crc kubenswrapper[4792]: I0301 09:22:04.241231 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539282-dcfkb" event={"ID":"1fcb7c96-6ab5-413c-b776-d1bc938e85c0","Type":"ContainerDied","Data":"9ed4209950b7dc0c0219f7cbf98dfb2fd0f042c8e8d978db82b2efba327f129a"} Mar 01 09:22:04 crc kubenswrapper[4792]: I0301 09:22:04.241266 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ed4209950b7dc0c0219f7cbf98dfb2fd0f042c8e8d978db82b2efba327f129a" Mar 01 09:22:04 crc kubenswrapper[4792]: I0301 09:22:04.241289 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539282-dcfkb" Mar 01 09:22:04 crc kubenswrapper[4792]: I0301 09:22:04.549139 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539276-sbq86"] Mar 01 09:22:04 crc kubenswrapper[4792]: I0301 09:22:04.556075 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539276-sbq86"] Mar 01 09:22:05 crc kubenswrapper[4792]: I0301 09:22:05.416158 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ffc24e3-2b40-4368-91a7-474239cc46fc" path="/var/lib/kubelet/pods/0ffc24e3-2b40-4368-91a7-474239cc46fc/volumes" Mar 01 09:22:09 crc kubenswrapper[4792]: I0301 09:22:09.778555 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.850602 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t"] Mar 01 09:22:22 crc kubenswrapper[4792]: E0301 09:22:22.852338 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fcb7c96-6ab5-413c-b776-d1bc938e85c0" containerName="oc" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.852426 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fcb7c96-6ab5-413c-b776-d1bc938e85c0" containerName="oc" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.852620 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fcb7c96-6ab5-413c-b776-d1bc938e85c0" containerName="oc" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.853463 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.871963 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.877685 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t"] Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.915271 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzcbl\" (UniqueName: \"kubernetes.io/projected/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-kube-api-access-pzcbl\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.915350 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.915415 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.016983 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzcbl\" (UniqueName: \"kubernetes.io/projected/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-kube-api-access-pzcbl\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.017501 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.018118 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-util\") pod 
\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.018058 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.018407 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.041781 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzcbl\" (UniqueName: \"kubernetes.io/projected/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-kube-api-access-pzcbl\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.190065 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.592223 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t"] Mar 01 09:22:23 crc kubenswrapper[4792]: W0301 09:22:23.605997 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7c3d28a_4f36_4a3c_a4f6_793a5f945cd4.slice/crio-d09e4aa9d099d0f5855da4eafb998a6a9b4f330a892524865861519e11e6a71e WatchSource:0}: Error finding container d09e4aa9d099d0f5855da4eafb998a6a9b4f330a892524865861519e11e6a71e: Status 404 returned error can't find the container with id d09e4aa9d099d0f5855da4eafb998a6a9b4f330a892524865861519e11e6a71e Mar 01 09:22:24 crc kubenswrapper[4792]: I0301 09:22:24.375123 4792 generic.go:334] "Generic (PLEG): container finished" podID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerID="4470a78593c70df9a7f1348a7ba2122a59c8f34035c2eb3e36dfe3908e9d37a9" exitCode=0 Mar 01 09:22:24 crc kubenswrapper[4792]: I0301 09:22:24.375425 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" event={"ID":"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4","Type":"ContainerDied","Data":"4470a78593c70df9a7f1348a7ba2122a59c8f34035c2eb3e36dfe3908e9d37a9"} Mar 01 09:22:24 crc kubenswrapper[4792]: I0301 09:22:24.376408 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" event={"ID":"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4","Type":"ContainerStarted","Data":"d09e4aa9d099d0f5855da4eafb998a6a9b4f330a892524865861519e11e6a71e"} Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.230406 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-hw8r6"] Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.232198 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.239023 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hw8r6"] Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.248949 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-catalog-content\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.249058 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssnqt\" (UniqueName: \"kubernetes.io/projected/2905773e-c73d-4965-83a8-b1eff758a9b6-kube-api-access-ssnqt\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.249116 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-utilities\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.350366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssnqt\" (UniqueName: \"kubernetes.io/projected/2905773e-c73d-4965-83a8-b1eff758a9b6-kube-api-access-ssnqt\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " 
pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.350434 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-utilities\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.350513 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-catalog-content\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.351083 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-catalog-content\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.351236 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-utilities\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.352820 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-zrzcg" podUID="86788093-42e5-4fa0-9595-97a910e6557e" containerName="console" containerID="cri-o://80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184" gracePeriod=15 Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.377638 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssnqt\" (UniqueName: \"kubernetes.io/projected/2905773e-c73d-4965-83a8-b1eff758a9b6-kube-api-access-ssnqt\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.562353 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.738832 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zrzcg_86788093-42e5-4fa0-9595-97a910e6557e/console/0.log" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.739150 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.757438 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-serving-cert\") pod \"86788093-42e5-4fa0-9595-97a910e6557e\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.757484 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-service-ca\") pod \"86788093-42e5-4fa0-9595-97a910e6557e\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.757535 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-oauth-serving-cert\") pod \"86788093-42e5-4fa0-9595-97a910e6557e\" (UID: 
\"86788093-42e5-4fa0-9595-97a910e6557e\") " Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.757563 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qmx8\" (UniqueName: \"kubernetes.io/projected/86788093-42e5-4fa0-9595-97a910e6557e-kube-api-access-6qmx8\") pod \"86788093-42e5-4fa0-9595-97a910e6557e\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.757594 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-console-config\") pod \"86788093-42e5-4fa0-9595-97a910e6557e\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.757622 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-trusted-ca-bundle\") pod \"86788093-42e5-4fa0-9595-97a910e6557e\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.757682 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-oauth-config\") pod \"86788093-42e5-4fa0-9595-97a910e6557e\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.758675 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-console-config" (OuterVolumeSpecName: "console-config") pod "86788093-42e5-4fa0-9595-97a910e6557e" (UID: "86788093-42e5-4fa0-9595-97a910e6557e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.758985 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "86788093-42e5-4fa0-9595-97a910e6557e" (UID: "86788093-42e5-4fa0-9595-97a910e6557e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.759224 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "86788093-42e5-4fa0-9595-97a910e6557e" (UID: "86788093-42e5-4fa0-9595-97a910e6557e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.766602 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-service-ca" (OuterVolumeSpecName: "service-ca") pod "86788093-42e5-4fa0-9595-97a910e6557e" (UID: "86788093-42e5-4fa0-9595-97a910e6557e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.767029 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86788093-42e5-4fa0-9595-97a910e6557e-kube-api-access-6qmx8" (OuterVolumeSpecName: "kube-api-access-6qmx8") pod "86788093-42e5-4fa0-9595-97a910e6557e" (UID: "86788093-42e5-4fa0-9595-97a910e6557e"). InnerVolumeSpecName "kube-api-access-6qmx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.768172 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "86788093-42e5-4fa0-9595-97a910e6557e" (UID: "86788093-42e5-4fa0-9595-97a910e6557e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.768463 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "86788093-42e5-4fa0-9595-97a910e6557e" (UID: "86788093-42e5-4fa0-9595-97a910e6557e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.798439 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hw8r6"] Mar 01 09:22:25 crc kubenswrapper[4792]: W0301 09:22:25.807215 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2905773e_c73d_4965_83a8_b1eff758a9b6.slice/crio-fc49f8e966aa2484369924988e53dd7acfdccbf1699d09f3aa942c9391b8496b WatchSource:0}: Error finding container fc49f8e966aa2484369924988e53dd7acfdccbf1699d09f3aa942c9391b8496b: Status 404 returned error can't find the container with id fc49f8e966aa2484369924988e53dd7acfdccbf1699d09f3aa942c9391b8496b Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.858371 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 
09:22:25.858398 4792 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.858408 4792 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.858417 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-service-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.858426 4792 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.858435 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qmx8\" (UniqueName: \"kubernetes.io/projected/86788093-42e5-4fa0-9595-97a910e6557e-kube-api-access-6qmx8\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.858443 4792 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-console-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.397441 4792 generic.go:334] "Generic (PLEG): container finished" podID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerID="b1412ef8ba2536904ff48ce6977ac97f0c2513be0b40dc923238218df7f2cb91" exitCode=0 Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.397594 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" event={"ID":"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4","Type":"ContainerDied","Data":"b1412ef8ba2536904ff48ce6977ac97f0c2513be0b40dc923238218df7f2cb91"} Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.399728 4792 generic.go:334] "Generic (PLEG): container finished" podID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerID="f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15" exitCode=0 Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.399788 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw8r6" event={"ID":"2905773e-c73d-4965-83a8-b1eff758a9b6","Type":"ContainerDied","Data":"f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15"} Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.399805 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw8r6" event={"ID":"2905773e-c73d-4965-83a8-b1eff758a9b6","Type":"ContainerStarted","Data":"fc49f8e966aa2484369924988e53dd7acfdccbf1699d09f3aa942c9391b8496b"} Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.409605 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zrzcg_86788093-42e5-4fa0-9595-97a910e6557e/console/0.log" Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.409653 4792 generic.go:334] "Generic (PLEG): container finished" podID="86788093-42e5-4fa0-9595-97a910e6557e" containerID="80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184" exitCode=2 Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.409683 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zrzcg" event={"ID":"86788093-42e5-4fa0-9595-97a910e6557e","Type":"ContainerDied","Data":"80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184"} Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.409710 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zrzcg" event={"ID":"86788093-42e5-4fa0-9595-97a910e6557e","Type":"ContainerDied","Data":"d8eca6eb41f8b18cd9f4704945f3f7da8382b69f90c64f182bda36eef645951e"} Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.409733 4792 scope.go:117] "RemoveContainer" containerID="80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184" Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.409939 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.487741 4792 scope.go:117] "RemoveContainer" containerID="80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184" Mar 01 09:22:26 crc kubenswrapper[4792]: E0301 09:22:26.497073 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184\": container with ID starting with 80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184 not found: ID does not exist" containerID="80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184" Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.497119 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184"} err="failed to get container status \"80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184\": rpc error: code = NotFound desc = could not find container \"80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184\": container with ID starting with 80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184 not found: ID does not exist" Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.502662 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-f9d7485db-zrzcg"] Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.509296 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-zrzcg"] Mar 01 09:22:27 crc kubenswrapper[4792]: I0301 09:22:27.420148 4792 generic.go:334] "Generic (PLEG): container finished" podID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerID="d67d114a432146643ca9daddb1ebead9e2e0d67e92df3f14f252b306bf3674b1" exitCode=0 Mar 01 09:22:27 crc kubenswrapper[4792]: I0301 09:22:27.421612 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86788093-42e5-4fa0-9595-97a910e6557e" path="/var/lib/kubelet/pods/86788093-42e5-4fa0-9595-97a910e6557e/volumes" Mar 01 09:22:27 crc kubenswrapper[4792]: I0301 09:22:27.422478 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" event={"ID":"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4","Type":"ContainerDied","Data":"d67d114a432146643ca9daddb1ebead9e2e0d67e92df3f14f252b306bf3674b1"} Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.431016 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw8r6" event={"ID":"2905773e-c73d-4965-83a8-b1eff758a9b6","Type":"ContainerStarted","Data":"f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400"} Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.654339 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.700463 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-util\") pod \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.700530 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-bundle\") pod \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.700601 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzcbl\" (UniqueName: \"kubernetes.io/projected/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-kube-api-access-pzcbl\") pod \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.701426 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-bundle" (OuterVolumeSpecName: "bundle") pod "a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" (UID: "a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.706073 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-kube-api-access-pzcbl" (OuterVolumeSpecName: "kube-api-access-pzcbl") pod "a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" (UID: "a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4"). InnerVolumeSpecName "kube-api-access-pzcbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.721926 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-util" (OuterVolumeSpecName: "util") pod "a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" (UID: "a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.801872 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzcbl\" (UniqueName: \"kubernetes.io/projected/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-kube-api-access-pzcbl\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.801930 4792 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-util\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.801948 4792 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:29 crc kubenswrapper[4792]: I0301 09:22:29.445631 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" event={"ID":"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4","Type":"ContainerDied","Data":"d09e4aa9d099d0f5855da4eafb998a6a9b4f330a892524865861519e11e6a71e"} Mar 01 09:22:29 crc kubenswrapper[4792]: I0301 09:22:29.446161 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d09e4aa9d099d0f5855da4eafb998a6a9b4f330a892524865861519e11e6a71e" Mar 01 09:22:29 crc kubenswrapper[4792]: I0301 09:22:29.445656 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:29 crc kubenswrapper[4792]: I0301 09:22:29.452008 4792 generic.go:334] "Generic (PLEG): container finished" podID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerID="f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400" exitCode=0 Mar 01 09:22:29 crc kubenswrapper[4792]: I0301 09:22:29.452062 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw8r6" event={"ID":"2905773e-c73d-4965-83a8-b1eff758a9b6","Type":"ContainerDied","Data":"f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400"} Mar 01 09:22:30 crc kubenswrapper[4792]: I0301 09:22:30.459629 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw8r6" event={"ID":"2905773e-c73d-4965-83a8-b1eff758a9b6","Type":"ContainerStarted","Data":"cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08"} Mar 01 09:22:30 crc kubenswrapper[4792]: I0301 09:22:30.477230 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hw8r6" podStartSLOduration=2.073781403 podStartE2EDuration="5.477214367s" podCreationTimestamp="2026-03-01 09:22:25 +0000 UTC" firstStartedPulling="2026-03-01 09:22:26.426400347 +0000 UTC m=+875.668279544" lastFinishedPulling="2026-03-01 09:22:29.829833281 +0000 UTC m=+879.071712508" observedRunningTime="2026-03-01 09:22:30.475826443 +0000 UTC m=+879.717705640" watchObservedRunningTime="2026-03-01 09:22:30.477214367 +0000 UTC m=+879.719093564" Mar 01 09:22:34 crc kubenswrapper[4792]: I0301 09:22:34.942545 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 
01 09:22:34 crc kubenswrapper[4792]: I0301 09:22:34.943114 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:22:35 crc kubenswrapper[4792]: I0301 09:22:35.563059 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:35 crc kubenswrapper[4792]: I0301 09:22:35.563112 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:36 crc kubenswrapper[4792]: I0301 09:22:36.603652 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hw8r6" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="registry-server" probeResult="failure" output=< Mar 01 09:22:36 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:22:36 crc kubenswrapper[4792]: > Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.321361 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz"] Mar 01 09:22:38 crc kubenswrapper[4792]: E0301 09:22:38.321815 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerName="util" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.321827 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerName="util" Mar 01 09:22:38 crc kubenswrapper[4792]: E0301 09:22:38.321839 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerName="extract" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 
09:22:38.321846 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerName="extract" Mar 01 09:22:38 crc kubenswrapper[4792]: E0301 09:22:38.321861 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86788093-42e5-4fa0-9595-97a910e6557e" containerName="console" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.321867 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="86788093-42e5-4fa0-9595-97a910e6557e" containerName="console" Mar 01 09:22:38 crc kubenswrapper[4792]: E0301 09:22:38.321876 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerName="pull" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.321882 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerName="pull" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.321988 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="86788093-42e5-4fa0-9595-97a910e6557e" containerName="console" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.321999 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerName="extract" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.322338 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.326645 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.327118 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.327282 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.327362 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.327470 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-2g8xw" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.357056 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz"] Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.431845 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rrzh\" (UniqueName: \"kubernetes.io/projected/ba22e25a-31e8-4ca7-b169-f7433eda818b-kube-api-access-5rrzh\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.432136 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba22e25a-31e8-4ca7-b169-f7433eda818b-webhook-cert\") pod 
\"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.432669 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba22e25a-31e8-4ca7-b169-f7433eda818b-apiservice-cert\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.534521 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rrzh\" (UniqueName: \"kubernetes.io/projected/ba22e25a-31e8-4ca7-b169-f7433eda818b-kube-api-access-5rrzh\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.534816 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba22e25a-31e8-4ca7-b169-f7433eda818b-webhook-cert\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.535829 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba22e25a-31e8-4ca7-b169-f7433eda818b-apiservice-cert\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc 
kubenswrapper[4792]: I0301 09:22:38.540784 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba22e25a-31e8-4ca7-b169-f7433eda818b-webhook-cert\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.541315 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba22e25a-31e8-4ca7-b169-f7433eda818b-apiservice-cert\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.561531 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rrzh\" (UniqueName: \"kubernetes.io/projected/ba22e25a-31e8-4ca7-b169-f7433eda818b-kube-api-access-5rrzh\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.637280 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.671847 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6"] Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.673056 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.676512 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-52bcm" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.676712 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.677510 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.751379 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6"] Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.839621 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cf86866e-8afa-44da-a688-e1c018a025bd-webhook-cert\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.839666 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xn97\" (UniqueName: \"kubernetes.io/projected/cf86866e-8afa-44da-a688-e1c018a025bd-kube-api-access-8xn97\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.839706 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/cf86866e-8afa-44da-a688-e1c018a025bd-apiservice-cert\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.940940 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cf86866e-8afa-44da-a688-e1c018a025bd-apiservice-cert\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.941218 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cf86866e-8afa-44da-a688-e1c018a025bd-webhook-cert\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.941246 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xn97\" (UniqueName: \"kubernetes.io/projected/cf86866e-8afa-44da-a688-e1c018a025bd-kube-api-access-8xn97\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.948317 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cf86866e-8afa-44da-a688-e1c018a025bd-apiservice-cert\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 
09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.949889 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cf86866e-8afa-44da-a688-e1c018a025bd-webhook-cert\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.968589 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xn97\" (UniqueName: \"kubernetes.io/projected/cf86866e-8afa-44da-a688-e1c018a025bd-kube-api-access-8xn97\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.993993 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz"] Mar 01 09:22:39 crc kubenswrapper[4792]: I0301 09:22:39.013946 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:39 crc kubenswrapper[4792]: I0301 09:22:39.336141 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6"] Mar 01 09:22:39 crc kubenswrapper[4792]: W0301 09:22:39.345700 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf86866e_8afa_44da_a688_e1c018a025bd.slice/crio-86d45064458c5acfdb658657c1ebc69cb067763c259b3db068b5bdcabcef2181 WatchSource:0}: Error finding container 86d45064458c5acfdb658657c1ebc69cb067763c259b3db068b5bdcabcef2181: Status 404 returned error can't find the container with id 86d45064458c5acfdb658657c1ebc69cb067763c259b3db068b5bdcabcef2181 Mar 01 09:22:39 crc kubenswrapper[4792]: I0301 09:22:39.514556 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" event={"ID":"ba22e25a-31e8-4ca7-b169-f7433eda818b","Type":"ContainerStarted","Data":"5eb6ffdc047fa110ebb051125dbd45715246f7404f1693ab496295b8fd7f3faa"} Mar 01 09:22:39 crc kubenswrapper[4792]: I0301 09:22:39.523727 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" event={"ID":"cf86866e-8afa-44da-a688-e1c018a025bd","Type":"ContainerStarted","Data":"86d45064458c5acfdb658657c1ebc69cb067763c259b3db068b5bdcabcef2181"} Mar 01 09:22:43 crc kubenswrapper[4792]: I0301 09:22:43.571853 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" event={"ID":"ba22e25a-31e8-4ca7-b169-f7433eda818b","Type":"ContainerStarted","Data":"99f873ea6be94f67a1b492a1dd3b7b43beb066e3994fe56e64d4744d9a32b891"} Mar 01 09:22:43 crc kubenswrapper[4792]: I0301 09:22:43.572505 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:45 crc kubenswrapper[4792]: I0301 09:22:45.584114 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" event={"ID":"cf86866e-8afa-44da-a688-e1c018a025bd","Type":"ContainerStarted","Data":"b2b42b80710f0a209f12adfaf613015c088fbd218de782f286e1ffa0664b4b3d"} Mar 01 09:22:45 crc kubenswrapper[4792]: I0301 09:22:45.585526 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:45 crc kubenswrapper[4792]: I0301 09:22:45.600015 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" podStartSLOduration=3.756940419 podStartE2EDuration="7.59999524s" podCreationTimestamp="2026-03-01 09:22:38 +0000 UTC" firstStartedPulling="2026-03-01 09:22:39.004617284 +0000 UTC m=+888.246496481" lastFinishedPulling="2026-03-01 09:22:42.847672095 +0000 UTC m=+892.089551302" observedRunningTime="2026-03-01 09:22:43.592664552 +0000 UTC m=+892.834543769" watchObservedRunningTime="2026-03-01 09:22:45.59999524 +0000 UTC m=+894.841874437" Mar 01 09:22:45 crc kubenswrapper[4792]: I0301 09:22:45.602318 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" podStartSLOduration=1.594532041 podStartE2EDuration="7.602301627s" podCreationTimestamp="2026-03-01 09:22:38 +0000 UTC" firstStartedPulling="2026-03-01 09:22:39.349277975 +0000 UTC m=+888.591157172" lastFinishedPulling="2026-03-01 09:22:45.357047561 +0000 UTC m=+894.598926758" observedRunningTime="2026-03-01 09:22:45.598614486 +0000 UTC m=+894.840493693" watchObservedRunningTime="2026-03-01 09:22:45.602301627 +0000 UTC m=+894.844180824" Mar 01 09:22:45 crc kubenswrapper[4792]: I0301 09:22:45.632804 4792 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:45 crc kubenswrapper[4792]: I0301 09:22:45.675326 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:45 crc kubenswrapper[4792]: I0301 09:22:45.868040 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hw8r6"] Mar 01 09:22:47 crc kubenswrapper[4792]: I0301 09:22:47.595007 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hw8r6" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="registry-server" containerID="cri-o://cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08" gracePeriod=2 Mar 01 09:22:47 crc kubenswrapper[4792]: I0301 09:22:47.975767 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.071313 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssnqt\" (UniqueName: \"kubernetes.io/projected/2905773e-c73d-4965-83a8-b1eff758a9b6-kube-api-access-ssnqt\") pod \"2905773e-c73d-4965-83a8-b1eff758a9b6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.071392 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-utilities\") pod \"2905773e-c73d-4965-83a8-b1eff758a9b6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.071477 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-catalog-content\") 
pod \"2905773e-c73d-4965-83a8-b1eff758a9b6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.073223 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-utilities" (OuterVolumeSpecName: "utilities") pod "2905773e-c73d-4965-83a8-b1eff758a9b6" (UID: "2905773e-c73d-4965-83a8-b1eff758a9b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.082346 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2905773e-c73d-4965-83a8-b1eff758a9b6-kube-api-access-ssnqt" (OuterVolumeSpecName: "kube-api-access-ssnqt") pod "2905773e-c73d-4965-83a8-b1eff758a9b6" (UID: "2905773e-c73d-4965-83a8-b1eff758a9b6"). InnerVolumeSpecName "kube-api-access-ssnqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.173330 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssnqt\" (UniqueName: \"kubernetes.io/projected/2905773e-c73d-4965-83a8-b1eff758a9b6-kube-api-access-ssnqt\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.173360 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.187572 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2905773e-c73d-4965-83a8-b1eff758a9b6" (UID: "2905773e-c73d-4965-83a8-b1eff758a9b6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.274688 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.602424 4792 generic.go:334] "Generic (PLEG): container finished" podID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerID="cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08" exitCode=0 Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.602468 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw8r6" event={"ID":"2905773e-c73d-4965-83a8-b1eff758a9b6","Type":"ContainerDied","Data":"cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08"} Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.602497 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw8r6" event={"ID":"2905773e-c73d-4965-83a8-b1eff758a9b6","Type":"ContainerDied","Data":"fc49f8e966aa2484369924988e53dd7acfdccbf1699d09f3aa942c9391b8496b"} Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.602517 4792 scope.go:117] "RemoveContainer" containerID="cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.602640 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.625282 4792 scope.go:117] "RemoveContainer" containerID="f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.638324 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hw8r6"] Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.650732 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hw8r6"] Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.655034 4792 scope.go:117] "RemoveContainer" containerID="f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.687589 4792 scope.go:117] "RemoveContainer" containerID="cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08" Mar 01 09:22:48 crc kubenswrapper[4792]: E0301 09:22:48.688061 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08\": container with ID starting with cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08 not found: ID does not exist" containerID="cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.688099 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08"} err="failed to get container status \"cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08\": rpc error: code = NotFound desc = could not find container \"cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08\": container with ID starting with cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08 not found: ID does 
not exist" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.688121 4792 scope.go:117] "RemoveContainer" containerID="f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400" Mar 01 09:22:48 crc kubenswrapper[4792]: E0301 09:22:48.688438 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400\": container with ID starting with f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400 not found: ID does not exist" containerID="f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.688460 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400"} err="failed to get container status \"f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400\": rpc error: code = NotFound desc = could not find container \"f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400\": container with ID starting with f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400 not found: ID does not exist" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.688486 4792 scope.go:117] "RemoveContainer" containerID="f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15" Mar 01 09:22:48 crc kubenswrapper[4792]: E0301 09:22:48.688754 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15\": container with ID starting with f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15 not found: ID does not exist" containerID="f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.688777 4792 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15"} err="failed to get container status \"f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15\": rpc error: code = NotFound desc = could not find container \"f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15\": container with ID starting with f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15 not found: ID does not exist" Mar 01 09:22:49 crc kubenswrapper[4792]: I0301 09:22:49.416497 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" path="/var/lib/kubelet/pods/2905773e-c73d-4965-83a8-b1eff758a9b6/volumes" Mar 01 09:22:52 crc kubenswrapper[4792]: I0301 09:22:52.858467 4792 scope.go:117] "RemoveContainer" containerID="7f82c367589f33dd358f1e6a48f6206b0470bf80f8be1de16c8420482e80dba1" Mar 01 09:22:59 crc kubenswrapper[4792]: I0301 09:22:59.019832 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:23:04 crc kubenswrapper[4792]: I0301 09:23:04.943114 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:23:04 crc kubenswrapper[4792]: I0301 09:23:04.943588 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:23:18 crc kubenswrapper[4792]: I0301 09:23:18.639484 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.457539 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-fjh95"] Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.458206 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="extract-content" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.458221 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="extract-content" Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.458248 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="extract-utilities" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.458255 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="extract-utilities" Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.458263 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="registry-server" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.458269 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="registry-server" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.458363 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="registry-server" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.460254 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.462510 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-reloader\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.462633 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-metrics\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.462793 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-frr-sockets\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.462867 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/53127911-b831-4b3a-816d-ff8271118244-frr-startup\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.462990 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttvlq\" (UniqueName: \"kubernetes.io/projected/53127911-b831-4b3a-816d-ff8271118244-kube-api-access-ttvlq\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 
09:23:19.463212 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53127911-b831-4b3a-816d-ff8271118244-metrics-certs\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.463306 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-frr-conf\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.465267 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-ml45x" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.469995 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk"] Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.470409 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.470476 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.470747 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.473874 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.490537 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk"] Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566463 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-frr-sockets\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566711 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/53127911-b831-4b3a-816d-ff8271118244-frr-startup\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566755 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttvlq\" (UniqueName: \"kubernetes.io/projected/53127911-b831-4b3a-816d-ff8271118244-kube-api-access-ttvlq\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566790 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53127911-b831-4b3a-816d-ff8271118244-metrics-certs\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566813 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-frr-conf\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566859 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-reloader\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566874 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-metrics\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566938 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-frr-sockets\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.567041 4792 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.567089 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53127911-b831-4b3a-816d-ff8271118244-metrics-certs podName:53127911-b831-4b3a-816d-ff8271118244 nodeName:}" failed. No retries permitted until 2026-03-01 09:23:20.067074566 +0000 UTC m=+929.308953763 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53127911-b831-4b3a-816d-ff8271118244-metrics-certs") pod "frr-k8s-fjh95" (UID: "53127911-b831-4b3a-816d-ff8271118244") : secret "frr-k8s-certs-secret" not found Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.567200 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-metrics\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.567380 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-frr-conf\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.568413 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/53127911-b831-4b3a-816d-ff8271118244-frr-startup\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.568538 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-reloader\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.579696 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zpr27"] Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.580769 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.592148 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-v7jnd" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.592313 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.592431 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.592536 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.606827 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttvlq\" (UniqueName: \"kubernetes.io/projected/53127911-b831-4b3a-816d-ff8271118244-kube-api-access-ttvlq\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.608225 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-twxml"] Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.609021 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.618450 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.643738 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-twxml"] Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.675730 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2f0572c-e661-495c-873c-6e2d18f2ab7d-cert\") pod \"frr-k8s-webhook-server-7f989f654f-kfnzk\" (UID: \"d2f0572c-e661-495c-873c-6e2d18f2ab7d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.675777 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vmsk\" (UniqueName: \"kubernetes.io/projected/d2f0572c-e661-495c-873c-6e2d18f2ab7d-kube-api-access-2vmsk\") pod \"frr-k8s-webhook-server-7f989f654f-kfnzk\" (UID: \"d2f0572c-e661-495c-873c-6e2d18f2ab7d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777309 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metrics-certs\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777377 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2f0572c-e661-495c-873c-6e2d18f2ab7d-cert\") pod \"frr-k8s-webhook-server-7f989f654f-kfnzk\" (UID: \"d2f0572c-e661-495c-873c-6e2d18f2ab7d\") " 
pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777397 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vmsk\" (UniqueName: \"kubernetes.io/projected/d2f0572c-e661-495c-873c-6e2d18f2ab7d-kube-api-access-2vmsk\") pod \"frr-k8s-webhook-server-7f989f654f-kfnzk\" (UID: \"d2f0572c-e661-495c-873c-6e2d18f2ab7d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777418 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metallb-excludel2\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777441 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-metrics-certs\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777461 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-cert\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777481 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist\") pod \"speaker-zpr27\" (UID: 
\"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777500 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n92z9\" (UniqueName: \"kubernetes.io/projected/f73a6813-31ea-4018-bd23-45bf2f1dfe89-kube-api-access-n92z9\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777525 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkrfz\" (UniqueName: \"kubernetes.io/projected/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-kube-api-access-mkrfz\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.777528 4792 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.777601 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2f0572c-e661-495c-873c-6e2d18f2ab7d-cert podName:d2f0572c-e661-495c-873c-6e2d18f2ab7d nodeName:}" failed. No retries permitted until 2026-03-01 09:23:20.277583901 +0000 UTC m=+929.519463098 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d2f0572c-e661-495c-873c-6e2d18f2ab7d-cert") pod "frr-k8s-webhook-server-7f989f654f-kfnzk" (UID: "d2f0572c-e661-495c-873c-6e2d18f2ab7d") : secret "frr-k8s-webhook-server-cert" not found Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.813692 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vmsk\" (UniqueName: \"kubernetes.io/projected/d2f0572c-e661-495c-873c-6e2d18f2ab7d-kube-api-access-2vmsk\") pod \"frr-k8s-webhook-server-7f989f654f-kfnzk\" (UID: \"d2f0572c-e661-495c-873c-6e2d18f2ab7d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.878690 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metrics-certs\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.878773 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metallb-excludel2\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.878799 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-metrics-certs\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.878817 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-cert\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.878836 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.878853 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n92z9\" (UniqueName: \"kubernetes.io/projected/f73a6813-31ea-4018-bd23-45bf2f1dfe89-kube-api-access-n92z9\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.878877 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkrfz\" (UniqueName: \"kubernetes.io/projected/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-kube-api-access-mkrfz\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.879250 4792 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.879298 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metrics-certs podName:8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7 nodeName:}" failed. No retries permitted until 2026-03-01 09:23:20.379286246 +0000 UTC m=+929.621165443 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metrics-certs") pod "speaker-zpr27" (UID: "8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7") : secret "speaker-certs-secret" not found Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.879968 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metallb-excludel2\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.880029 4792 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.880054 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-metrics-certs podName:f73a6813-31ea-4018-bd23-45bf2f1dfe89 nodeName:}" failed. No retries permitted until 2026-03-01 09:23:20.380046085 +0000 UTC m=+929.621925282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-metrics-certs") pod "controller-86ddb6bd46-twxml" (UID: "f73a6813-31ea-4018-bd23-45bf2f1dfe89") : secret "controller-certs-secret" not found Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.880117 4792 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.880138 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist podName:8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7 nodeName:}" failed. 
No retries permitted until 2026-03-01 09:23:20.380131267 +0000 UTC m=+929.622010454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist") pod "speaker-zpr27" (UID: "8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7") : secret "metallb-memberlist" not found Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.882614 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.901534 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-cert\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.905008 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n92z9\" (UniqueName: \"kubernetes.io/projected/f73a6813-31ea-4018-bd23-45bf2f1dfe89-kube-api-access-n92z9\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.905736 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkrfz\" (UniqueName: \"kubernetes.io/projected/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-kube-api-access-mkrfz\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.081173 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53127911-b831-4b3a-816d-ff8271118244-metrics-certs\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " 
pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.091449 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53127911-b831-4b3a-816d-ff8271118244-metrics-certs\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.283087 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2f0572c-e661-495c-873c-6e2d18f2ab7d-cert\") pod \"frr-k8s-webhook-server-7f989f654f-kfnzk\" (UID: \"d2f0572c-e661-495c-873c-6e2d18f2ab7d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.285766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2f0572c-e661-495c-873c-6e2d18f2ab7d-cert\") pod \"frr-k8s-webhook-server-7f989f654f-kfnzk\" (UID: \"d2f0572c-e661-495c-873c-6e2d18f2ab7d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.383170 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.384114 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metrics-certs\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.385123 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-metrics-certs\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.385182 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:20 crc kubenswrapper[4792]: E0301 09:23:20.385334 4792 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 01 09:23:20 crc kubenswrapper[4792]: E0301 09:23:20.385400 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist podName:8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7 nodeName:}" failed. No retries permitted until 2026-03-01 09:23:21.385376013 +0000 UTC m=+930.627255240 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist") pod "speaker-zpr27" (UID: "8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7") : secret "metallb-memberlist" not found Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.387832 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-metrics-certs\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.387999 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metrics-certs\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.394206 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.501297 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.558437 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.753888 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-twxml"] Mar 01 09:23:20 crc kubenswrapper[4792]: W0301 09:23:20.760655 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf73a6813_31ea_4018_bd23_45bf2f1dfe89.slice/crio-dd7b7cadbd0b299fb1c2a072488f1ceccc61669b5710fd67105da12d92e4f954 WatchSource:0}: Error finding container dd7b7cadbd0b299fb1c2a072488f1ceccc61669b5710fd67105da12d92e4f954: Status 404 returned error can't find the container with id dd7b7cadbd0b299fb1c2a072488f1ceccc61669b5710fd67105da12d92e4f954 Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.798641 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk"] Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.822389 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-twxml" event={"ID":"f73a6813-31ea-4018-bd23-45bf2f1dfe89","Type":"ContainerStarted","Data":"dd7b7cadbd0b299fb1c2a072488f1ceccc61669b5710fd67105da12d92e4f954"} Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.823202 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerStarted","Data":"f74d28b794be22820b963fd68426650c305ba12ef9c68cb95e15c544960c5fdb"} Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.825574 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" event={"ID":"d2f0572c-e661-495c-873c-6e2d18f2ab7d","Type":"ContainerStarted","Data":"f943325d04a20f53a0d112a1c731bc17eededc9d94b599cb138af76e39e12202"} Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.395992 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.401841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.692825 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zpr27" Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.858298 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-twxml" event={"ID":"f73a6813-31ea-4018-bd23-45bf2f1dfe89","Type":"ContainerStarted","Data":"eabe384ac232e5735ff297d62687099b4e73e91c340b530929a4cae49694055d"} Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.860215 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-twxml" event={"ID":"f73a6813-31ea-4018-bd23-45bf2f1dfe89","Type":"ContainerStarted","Data":"713ee522ef6f3709439e71064ddec2c9754005218fadb3586ec288372a43e550"} Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.860349 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.876325 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zpr27" event={"ID":"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7","Type":"ContainerStarted","Data":"842e7a6e0b3d1e46c10b1ce8dee51b18771e54a186215e7f33679d56b4259e47"} Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.923347 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-twxml" podStartSLOduration=2.9233276249999998 podStartE2EDuration="2.923327625s" podCreationTimestamp="2026-03-01 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:23:21.912472149 +0000 UTC m=+931.154351346" watchObservedRunningTime="2026-03-01 09:23:21.923327625 +0000 UTC m=+931.165206822" Mar 01 09:23:22 crc kubenswrapper[4792]: I0301 09:23:22.890528 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zpr27" event={"ID":"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7","Type":"ContainerStarted","Data":"08ee59ceb76dc5ca9b8375a9ccea179ce981cbe1aec845042bbc4d3e1d1e9628"} Mar 01 09:23:22 crc kubenswrapper[4792]: I0301 09:23:22.890578 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zpr27" event={"ID":"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7","Type":"ContainerStarted","Data":"2f3df5e44eb8abc0a9ba2ca75ed28771e6a184e173622de84ef7ac03adf5f95d"} Mar 01 09:23:22 crc kubenswrapper[4792]: I0301 09:23:22.910877 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zpr27" podStartSLOduration=3.910858755 podStartE2EDuration="3.910858755s" podCreationTimestamp="2026-03-01 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:23:22.906522988 +0000 UTC m=+932.148402185" watchObservedRunningTime="2026-03-01 09:23:22.910858755 +0000 UTC m=+932.152737952" Mar 01 09:23:23 crc kubenswrapper[4792]: I0301 09:23:23.896468 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zpr27" Mar 01 09:23:27 crc kubenswrapper[4792]: I0301 09:23:27.927190 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" event={"ID":"d2f0572c-e661-495c-873c-6e2d18f2ab7d","Type":"ContainerStarted","Data":"9856058333ce3fd85c0ea840c995e331a0fed9a49e9cee3fdfba8075b2489ea6"} Mar 01 09:23:27 crc kubenswrapper[4792]: I0301 09:23:27.928752 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:27 crc kubenswrapper[4792]: I0301 09:23:27.930600 4792 generic.go:334] "Generic (PLEG): container finished" podID="53127911-b831-4b3a-816d-ff8271118244" containerID="2849c13ea036c394232db22832c7b8a847417cc540e5e50969a2f5e3d8c2e8ce" exitCode=0 Mar 01 09:23:27 crc kubenswrapper[4792]: I0301 09:23:27.930653 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerDied","Data":"2849c13ea036c394232db22832c7b8a847417cc540e5e50969a2f5e3d8c2e8ce"} Mar 01 09:23:27 crc kubenswrapper[4792]: I0301 09:23:27.984457 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" podStartSLOduration=2.042168546 podStartE2EDuration="8.984438153s" podCreationTimestamp="2026-03-01 09:23:19 +0000 UTC" firstStartedPulling="2026-03-01 09:23:20.809883327 +0000 UTC m=+930.051762524" lastFinishedPulling="2026-03-01 09:23:27.752152934 +0000 UTC m=+936.994032131" observedRunningTime="2026-03-01 09:23:27.951117095 +0000 UTC m=+937.192996292" watchObservedRunningTime="2026-03-01 09:23:27.984438153 +0000 UTC m=+937.226317350" Mar 01 09:23:28 crc kubenswrapper[4792]: I0301 09:23:28.937633 4792 generic.go:334] "Generic (PLEG): container finished" podID="53127911-b831-4b3a-816d-ff8271118244" containerID="aa4c9d8df43a35764346935e48866f5131887b7f2a4eb9631d0df7fdf5b57e5e" exitCode=0 Mar 01 09:23:28 crc kubenswrapper[4792]: I0301 09:23:28.937722 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerDied","Data":"aa4c9d8df43a35764346935e48866f5131887b7f2a4eb9631d0df7fdf5b57e5e"} Mar 01 09:23:29 crc kubenswrapper[4792]: I0301 09:23:29.945824 4792 generic.go:334] "Generic (PLEG): container finished" podID="53127911-b831-4b3a-816d-ff8271118244" containerID="c5567badd19bd0774a5879cb60e83c9e9efe133a039626364b25b32420a0fbd9" exitCode=0 Mar 01 09:23:29 crc kubenswrapper[4792]: I0301 09:23:29.946174 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerDied","Data":"c5567badd19bd0774a5879cb60e83c9e9efe133a039626364b25b32420a0fbd9"} Mar 01 09:23:30 crc kubenswrapper[4792]: I0301 09:23:30.576585 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:30 crc kubenswrapper[4792]: I0301 09:23:30.963448 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerStarted","Data":"34e4bb559cfb9d6e311c8dfbe2b6e322b66c7cef954dce8387c2f2aea3eda5bd"} Mar 01 09:23:30 crc kubenswrapper[4792]: I0301 09:23:30.963489 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerStarted","Data":"dfc24e756e4ba7fa7e3d2587c6a5e57d3eafe42995d193b132e5e55c76946896"} Mar 01 09:23:30 crc kubenswrapper[4792]: I0301 09:23:30.963502 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerStarted","Data":"26f91c5c157dae44d9ef6b2752ae51f2a09dfb570bcddcd67b7b331f9f89f181"} Mar 01 09:23:30 crc kubenswrapper[4792]: I0301 09:23:30.963512 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" 
event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerStarted","Data":"90be616421c9f3419a458a1fef2213ebed5c52b4aa4b8799bed8863dac9370f9"} Mar 01 09:23:30 crc kubenswrapper[4792]: I0301 09:23:30.963522 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerStarted","Data":"4afebf9e80babf7e58e9ef7416b26544ff34766a1d654b36f2e4854e21aa3036"} Mar 01 09:23:31 crc kubenswrapper[4792]: I0301 09:23:31.973296 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerStarted","Data":"f5a0c46df22e1e5bf9fd8e6d06a9d81d9a441358efdd9259aa7325c336c66cf0"} Mar 01 09:23:31 crc kubenswrapper[4792]: I0301 09:23:31.973591 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:31 crc kubenswrapper[4792]: I0301 09:23:31.995491 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-fjh95" podStartSLOduration=5.726734755 podStartE2EDuration="12.995470342s" podCreationTimestamp="2026-03-01 09:23:19 +0000 UTC" firstStartedPulling="2026-03-01 09:23:20.50107341 +0000 UTC m=+929.742952607" lastFinishedPulling="2026-03-01 09:23:27.769809007 +0000 UTC m=+937.011688194" observedRunningTime="2026-03-01 09:23:31.991042363 +0000 UTC m=+941.232921560" watchObservedRunningTime="2026-03-01 09:23:31.995470342 +0000 UTC m=+941.237349529" Mar 01 09:23:34 crc kubenswrapper[4792]: I0301 09:23:34.942577 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:23:34 crc kubenswrapper[4792]: I0301 09:23:34.942822 4792 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:23:34 crc kubenswrapper[4792]: I0301 09:23:34.942855 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:23:34 crc kubenswrapper[4792]: I0301 09:23:34.943584 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f223d76bf80d673696c61fa09c13cc363c202a24d360c9e2da1e52c335e05521"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:23:34 crc kubenswrapper[4792]: I0301 09:23:34.943637 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://f223d76bf80d673696c61fa09c13cc363c202a24d360c9e2da1e52c335e05521" gracePeriod=600 Mar 01 09:23:35 crc kubenswrapper[4792]: I0301 09:23:35.384093 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:35 crc kubenswrapper[4792]: I0301 09:23:35.439994 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:36 crc kubenswrapper[4792]: I0301 09:23:36.001157 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="f223d76bf80d673696c61fa09c13cc363c202a24d360c9e2da1e52c335e05521" exitCode=0 Mar 01 09:23:36 crc kubenswrapper[4792]: I0301 09:23:36.002242 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"f223d76bf80d673696c61fa09c13cc363c202a24d360c9e2da1e52c335e05521"} Mar 01 09:23:36 crc kubenswrapper[4792]: I0301 09:23:36.002355 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"9fc8b9702d9d3591695478729a2a209996e2f83219ba8649a31afc02f286ad3f"} Mar 01 09:23:36 crc kubenswrapper[4792]: I0301 09:23:36.002448 4792 scope.go:117] "RemoveContainer" containerID="3d6c435219d8bc835f1f13dc6f334e1ca86fd764226b575caaadee4dad9e5209" Mar 01 09:23:40 crc kubenswrapper[4792]: I0301 09:23:40.386007 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:40 crc kubenswrapper[4792]: I0301 09:23:40.398803 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:41 crc kubenswrapper[4792]: I0301 09:23:41.698391 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zpr27" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.342619 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wsvzs"] Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.344049 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wsvzs" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.347764 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.347778 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-h4vjh" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.351457 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.392085 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn9gn\" (UniqueName: \"kubernetes.io/projected/00b31e3f-8443-487a-916a-59ec98ccd4de-kube-api-access-vn9gn\") pod \"openstack-operator-index-wsvzs\" (UID: \"00b31e3f-8443-487a-916a-59ec98ccd4de\") " pod="openstack-operators/openstack-operator-index-wsvzs" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.395434 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wsvzs"] Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.493244 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn9gn\" (UniqueName: \"kubernetes.io/projected/00b31e3f-8443-487a-916a-59ec98ccd4de-kube-api-access-vn9gn\") pod \"openstack-operator-index-wsvzs\" (UID: \"00b31e3f-8443-487a-916a-59ec98ccd4de\") " pod="openstack-operators/openstack-operator-index-wsvzs" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.512529 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn9gn\" (UniqueName: \"kubernetes.io/projected/00b31e3f-8443-487a-916a-59ec98ccd4de-kube-api-access-vn9gn\") pod \"openstack-operator-index-wsvzs\" (UID: 
\"00b31e3f-8443-487a-916a-59ec98ccd4de\") " pod="openstack-operators/openstack-operator-index-wsvzs" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.663810 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wsvzs" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.879436 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wsvzs"] Mar 01 09:23:45 crc kubenswrapper[4792]: I0301 09:23:45.057375 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wsvzs" event={"ID":"00b31e3f-8443-487a-916a-59ec98ccd4de","Type":"ContainerStarted","Data":"e98ee9c3b90fa09d03524b35e6a99ec6a5872d1dba45dc8484cb95ef1f21e4e3"} Mar 01 09:23:47 crc kubenswrapper[4792]: I0301 09:23:47.069797 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wsvzs" event={"ID":"00b31e3f-8443-487a-916a-59ec98ccd4de","Type":"ContainerStarted","Data":"059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe"} Mar 01 09:23:47 crc kubenswrapper[4792]: I0301 09:23:47.083411 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wsvzs" podStartSLOduration=2.020652226 podStartE2EDuration="3.083392569s" podCreationTimestamp="2026-03-01 09:23:44 +0000 UTC" firstStartedPulling="2026-03-01 09:23:44.88775647 +0000 UTC m=+954.129635667" lastFinishedPulling="2026-03-01 09:23:45.950496813 +0000 UTC m=+955.192376010" observedRunningTime="2026-03-01 09:23:47.082436895 +0000 UTC m=+956.324316102" watchObservedRunningTime="2026-03-01 09:23:47.083392569 +0000 UTC m=+956.325271776" Mar 01 09:23:47 crc kubenswrapper[4792]: I0301 09:23:47.703307 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wsvzs"] Mar 01 09:23:48 crc kubenswrapper[4792]: I0301 09:23:48.311828 
4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5kfk4"] Mar 01 09:23:48 crc kubenswrapper[4792]: I0301 09:23:48.312694 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:48 crc kubenswrapper[4792]: I0301 09:23:48.325012 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5kfk4"] Mar 01 09:23:48 crc kubenswrapper[4792]: I0301 09:23:48.437381 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hllql\" (UniqueName: \"kubernetes.io/projected/dc22117a-72a7-4838-bb1c-111e91514b98-kube-api-access-hllql\") pod \"openstack-operator-index-5kfk4\" (UID: \"dc22117a-72a7-4838-bb1c-111e91514b98\") " pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:48 crc kubenswrapper[4792]: I0301 09:23:48.538531 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hllql\" (UniqueName: \"kubernetes.io/projected/dc22117a-72a7-4838-bb1c-111e91514b98-kube-api-access-hllql\") pod \"openstack-operator-index-5kfk4\" (UID: \"dc22117a-72a7-4838-bb1c-111e91514b98\") " pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:48 crc kubenswrapper[4792]: I0301 09:23:48.558011 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hllql\" (UniqueName: \"kubernetes.io/projected/dc22117a-72a7-4838-bb1c-111e91514b98-kube-api-access-hllql\") pod \"openstack-operator-index-5kfk4\" (UID: \"dc22117a-72a7-4838-bb1c-111e91514b98\") " pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:48 crc kubenswrapper[4792]: I0301 09:23:48.630950 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:49 crc kubenswrapper[4792]: I0301 09:23:49.039095 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5kfk4"] Mar 01 09:23:49 crc kubenswrapper[4792]: W0301 09:23:49.044526 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc22117a_72a7_4838_bb1c_111e91514b98.slice/crio-00e2da99ed5c797263d32d1b50464306d1a3e39c15fbab96cd76bab996b0fa34 WatchSource:0}: Error finding container 00e2da99ed5c797263d32d1b50464306d1a3e39c15fbab96cd76bab996b0fa34: Status 404 returned error can't find the container with id 00e2da99ed5c797263d32d1b50464306d1a3e39c15fbab96cd76bab996b0fa34 Mar 01 09:23:49 crc kubenswrapper[4792]: I0301 09:23:49.088272 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wsvzs" podUID="00b31e3f-8443-487a-916a-59ec98ccd4de" containerName="registry-server" containerID="cri-o://059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe" gracePeriod=2 Mar 01 09:23:49 crc kubenswrapper[4792]: I0301 09:23:49.088649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5kfk4" event={"ID":"dc22117a-72a7-4838-bb1c-111e91514b98","Type":"ContainerStarted","Data":"00e2da99ed5c797263d32d1b50464306d1a3e39c15fbab96cd76bab996b0fa34"} Mar 01 09:23:49 crc kubenswrapper[4792]: I0301 09:23:49.457505 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wsvzs" Mar 01 09:23:49 crc kubenswrapper[4792]: I0301 09:23:49.652771 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn9gn\" (UniqueName: \"kubernetes.io/projected/00b31e3f-8443-487a-916a-59ec98ccd4de-kube-api-access-vn9gn\") pod \"00b31e3f-8443-487a-916a-59ec98ccd4de\" (UID: \"00b31e3f-8443-487a-916a-59ec98ccd4de\") " Mar 01 09:23:49 crc kubenswrapper[4792]: I0301 09:23:49.658954 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00b31e3f-8443-487a-916a-59ec98ccd4de-kube-api-access-vn9gn" (OuterVolumeSpecName: "kube-api-access-vn9gn") pod "00b31e3f-8443-487a-916a-59ec98ccd4de" (UID: "00b31e3f-8443-487a-916a-59ec98ccd4de"). InnerVolumeSpecName "kube-api-access-vn9gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:23:49 crc kubenswrapper[4792]: I0301 09:23:49.754684 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn9gn\" (UniqueName: \"kubernetes.io/projected/00b31e3f-8443-487a-916a-59ec98ccd4de-kube-api-access-vn9gn\") on node \"crc\" DevicePath \"\"" Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.094729 4792 generic.go:334] "Generic (PLEG): container finished" podID="00b31e3f-8443-487a-916a-59ec98ccd4de" containerID="059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe" exitCode=0 Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.094814 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wsvzs" event={"ID":"00b31e3f-8443-487a-916a-59ec98ccd4de","Type":"ContainerDied","Data":"059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe"} Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.094846 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wsvzs" 
event={"ID":"00b31e3f-8443-487a-916a-59ec98ccd4de","Type":"ContainerDied","Data":"e98ee9c3b90fa09d03524b35e6a99ec6a5872d1dba45dc8484cb95ef1f21e4e3"} Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.094868 4792 scope.go:117] "RemoveContainer" containerID="059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe" Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.095014 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wsvzs" Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.101195 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5kfk4" event={"ID":"dc22117a-72a7-4838-bb1c-111e91514b98","Type":"ContainerStarted","Data":"b278cb80c20ca321724b63e627678ef7ebad47bdb0fac71db7314c188b072401"} Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.116543 4792 scope.go:117] "RemoveContainer" containerID="059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe" Mar 01 09:23:50 crc kubenswrapper[4792]: E0301 09:23:50.116958 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe\": container with ID starting with 059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe not found: ID does not exist" containerID="059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe" Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.116993 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe"} err="failed to get container status \"059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe\": rpc error: code = NotFound desc = could not find container \"059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe\": container with ID starting with 
059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe not found: ID does not exist" Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.130196 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5kfk4" podStartSLOduration=1.752191838 podStartE2EDuration="2.13015009s" podCreationTimestamp="2026-03-01 09:23:48 +0000 UTC" firstStartedPulling="2026-03-01 09:23:49.049030466 +0000 UTC m=+958.290909663" lastFinishedPulling="2026-03-01 09:23:49.426988718 +0000 UTC m=+958.668867915" observedRunningTime="2026-03-01 09:23:50.120100983 +0000 UTC m=+959.361980190" watchObservedRunningTime="2026-03-01 09:23:50.13015009 +0000 UTC m=+959.372029287" Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.141120 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wsvzs"] Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.144273 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-wsvzs"] Mar 01 09:23:51 crc kubenswrapper[4792]: I0301 09:23:51.423330 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00b31e3f-8443-487a-916a-59ec98ccd4de" path="/var/lib/kubelet/pods/00b31e3f-8443-487a-916a-59ec98ccd4de/volumes" Mar 01 09:23:58 crc kubenswrapper[4792]: I0301 09:23:58.632127 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:58 crc kubenswrapper[4792]: I0301 09:23:58.632485 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:58 crc kubenswrapper[4792]: I0301 09:23:58.660934 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:59 crc kubenswrapper[4792]: I0301 09:23:59.193380 4792 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.124324 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539284-g5rbc"] Mar 01 09:24:00 crc kubenswrapper[4792]: E0301 09:24:00.124858 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b31e3f-8443-487a-916a-59ec98ccd4de" containerName="registry-server" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.124873 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b31e3f-8443-487a-916a-59ec98ccd4de" containerName="registry-server" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.125022 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="00b31e3f-8443-487a-916a-59ec98ccd4de" containerName="registry-server" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.125455 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539284-g5rbc" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.128246 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.128899 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.135073 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539284-g5rbc"] Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.178220 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.191187 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"] Mar 01 09:24:00 
crc kubenswrapper[4792]: I0301 09:24:00.192525 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.195072 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-util\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.195185 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqkqn\" (UniqueName: \"kubernetes.io/projected/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-kube-api-access-cqkqn\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.195241 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-bundle\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.195276 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwt58\" (UniqueName: \"kubernetes.io/projected/1263f40a-23c7-4ab8-8ebc-7c697e2eacd6-kube-api-access-nwt58\") pod \"auto-csr-approver-29539284-g5rbc\" (UID: \"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6\") " pod="openshift-infra/auto-csr-approver-29539284-g5rbc"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.206501 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-s7c52"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.212625 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"]
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.296498 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-util\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.296560 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqkqn\" (UniqueName: \"kubernetes.io/projected/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-kube-api-access-cqkqn\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.296587 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-bundle\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.296614 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwt58\" (UniqueName: \"kubernetes.io/projected/1263f40a-23c7-4ab8-8ebc-7c697e2eacd6-kube-api-access-nwt58\") pod \"auto-csr-approver-29539284-g5rbc\" (UID: \"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6\") " pod="openshift-infra/auto-csr-approver-29539284-g5rbc"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.296928 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-util\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.300179 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-bundle\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.316590 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqkqn\" (UniqueName: \"kubernetes.io/projected/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-kube-api-access-cqkqn\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.317022 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwt58\" (UniqueName: \"kubernetes.io/projected/1263f40a-23c7-4ab8-8ebc-7c697e2eacd6-kube-api-access-nwt58\") pod \"auto-csr-approver-29539284-g5rbc\" (UID: \"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6\") " pod="openshift-infra/auto-csr-approver-29539284-g5rbc"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.489102 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539284-g5rbc"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.507427 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.765462 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"]
Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.900096 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539284-g5rbc"]
Mar 01 09:24:00 crc kubenswrapper[4792]: W0301 09:24:00.904652 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1263f40a_23c7_4ab8_8ebc_7c697e2eacd6.slice/crio-6d4a810ccd8818aa2aa4996c08f4aecc31ca3e184af445411fec70a53dd8b3d3 WatchSource:0}: Error finding container 6d4a810ccd8818aa2aa4996c08f4aecc31ca3e184af445411fec70a53dd8b3d3: Status 404 returned error can't find the container with id 6d4a810ccd8818aa2aa4996c08f4aecc31ca3e184af445411fec70a53dd8b3d3
Mar 01 09:24:01 crc kubenswrapper[4792]: I0301 09:24:01.199552 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539284-g5rbc" event={"ID":"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6","Type":"ContainerStarted","Data":"6d4a810ccd8818aa2aa4996c08f4aecc31ca3e184af445411fec70a53dd8b3d3"}
Mar 01 09:24:01 crc kubenswrapper[4792]: I0301 09:24:01.202104 4792 generic.go:334] "Generic (PLEG): container finished" podID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerID="e303016694d80717056bed8ebe7ac0ed91d935075c1a2c8363a48e9a34247c9f" exitCode=0
Mar 01 09:24:01 crc kubenswrapper[4792]: I0301 09:24:01.202361 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" event={"ID":"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd","Type":"ContainerDied","Data":"e303016694d80717056bed8ebe7ac0ed91d935075c1a2c8363a48e9a34247c9f"}
Mar 01 09:24:01 crc kubenswrapper[4792]: I0301 09:24:01.202393 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" event={"ID":"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd","Type":"ContainerStarted","Data":"901d71ab1004a975c28ad1740acca30e4ad62081d579542b1327000938dbb1e0"}
Mar 01 09:24:02 crc kubenswrapper[4792]: I0301 09:24:02.216626 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539284-g5rbc" event={"ID":"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6","Type":"ContainerStarted","Data":"25e2a65861bddb5cf69014ea8d6e4a60ec1aeeeac6538cad770632d905286110"}
Mar 01 09:24:02 crc kubenswrapper[4792]: I0301 09:24:02.221133 4792 generic.go:334] "Generic (PLEG): container finished" podID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerID="857a9688c261d295e943ef630ce635540fb82ab9abcae64b93c371953ef33783" exitCode=0
Mar 01 09:24:02 crc kubenswrapper[4792]: I0301 09:24:02.221259 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" event={"ID":"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd","Type":"ContainerDied","Data":"857a9688c261d295e943ef630ce635540fb82ab9abcae64b93c371953ef33783"}
Mar 01 09:24:02 crc kubenswrapper[4792]: I0301 09:24:02.237980 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539284-g5rbc" podStartSLOduration=1.290508193 podStartE2EDuration="2.237961349s" podCreationTimestamp="2026-03-01 09:24:00 +0000 UTC" firstStartedPulling="2026-03-01 09:24:00.906807599 +0000 UTC m=+970.148686816" lastFinishedPulling="2026-03-01 09:24:01.854260765 +0000 UTC m=+971.096139972" observedRunningTime="2026-03-01 09:24:02.234848322 +0000 UTC m=+971.476727549" watchObservedRunningTime="2026-03-01 09:24:02.237961349 +0000 UTC m=+971.479840556"
Mar 01 09:24:03 crc kubenswrapper[4792]: I0301 09:24:03.232552 4792 generic.go:334] "Generic (PLEG): container finished" podID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerID="6ff8df30eacd38cf3a14434c98b1a8b0327d53de2b6a40501367acbf0b0e60ea" exitCode=0
Mar 01 09:24:03 crc kubenswrapper[4792]: I0301 09:24:03.232651 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" event={"ID":"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd","Type":"ContainerDied","Data":"6ff8df30eacd38cf3a14434c98b1a8b0327d53de2b6a40501367acbf0b0e60ea"}
Mar 01 09:24:03 crc kubenswrapper[4792]: I0301 09:24:03.235987 4792 generic.go:334] "Generic (PLEG): container finished" podID="1263f40a-23c7-4ab8-8ebc-7c697e2eacd6" containerID="25e2a65861bddb5cf69014ea8d6e4a60ec1aeeeac6538cad770632d905286110" exitCode=0
Mar 01 09:24:03 crc kubenswrapper[4792]: I0301 09:24:03.236079 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539284-g5rbc" event={"ID":"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6","Type":"ContainerDied","Data":"25e2a65861bddb5cf69014ea8d6e4a60ec1aeeeac6538cad770632d905286110"}
Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.488279 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539284-g5rbc"
Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.565628 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwt58\" (UniqueName: \"kubernetes.io/projected/1263f40a-23c7-4ab8-8ebc-7c697e2eacd6-kube-api-access-nwt58\") pod \"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6\" (UID: \"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6\") "
Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.570723 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1263f40a-23c7-4ab8-8ebc-7c697e2eacd6-kube-api-access-nwt58" (OuterVolumeSpecName: "kube-api-access-nwt58") pod "1263f40a-23c7-4ab8-8ebc-7c697e2eacd6" (UID: "1263f40a-23c7-4ab8-8ebc-7c697e2eacd6"). InnerVolumeSpecName "kube-api-access-nwt58". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.571944 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"
Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.667067 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-util\") pod \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") "
Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.667186 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-bundle\") pod \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") "
Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.667212 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqkqn\" (UniqueName: \"kubernetes.io/projected/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-kube-api-access-cqkqn\") pod \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") "
Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.667402 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwt58\" (UniqueName: \"kubernetes.io/projected/1263f40a-23c7-4ab8-8ebc-7c697e2eacd6-kube-api-access-nwt58\") on node \"crc\" DevicePath \"\""
Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.668380 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-bundle" (OuterVolumeSpecName: "bundle") pod "d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" (UID: "d4447fd9-d2df-47f4-a94f-ff8b4c5080bd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.670428 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-kube-api-access-cqkqn" (OuterVolumeSpecName: "kube-api-access-cqkqn") pod "d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" (UID: "d4447fd9-d2df-47f4-a94f-ff8b4c5080bd"). InnerVolumeSpecName "kube-api-access-cqkqn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.683048 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-util" (OuterVolumeSpecName: "util") pod "d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" (UID: "d4447fd9-d2df-47f4-a94f-ff8b4c5080bd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.768467 4792 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-util\") on node \"crc\" DevicePath \"\""
Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.768708 4792 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.768790 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqkqn\" (UniqueName: \"kubernetes.io/projected/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-kube-api-access-cqkqn\") on node \"crc\" DevicePath \"\""
Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.251264 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539284-g5rbc" event={"ID":"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6","Type":"ContainerDied","Data":"6d4a810ccd8818aa2aa4996c08f4aecc31ca3e184af445411fec70a53dd8b3d3"}
Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.251624 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d4a810ccd8818aa2aa4996c08f4aecc31ca3e184af445411fec70a53dd8b3d3"
Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.251314 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539284-g5rbc"
Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.253455 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" event={"ID":"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd","Type":"ContainerDied","Data":"901d71ab1004a975c28ad1740acca30e4ad62081d579542b1327000938dbb1e0"}
Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.253493 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="901d71ab1004a975c28ad1740acca30e4ad62081d579542b1327000938dbb1e0"
Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.253595 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"
Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.546037 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539278-dkkbp"]
Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.551435 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539278-dkkbp"]
Mar 01 09:24:07 crc kubenswrapper[4792]: I0301 09:24:07.416878 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d17aa56-5b61-403d-9d20-cb300aabc44d" path="/var/lib/kubelet/pods/0d17aa56-5b61-403d-9d20-cb300aabc44d/volumes"
Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.753265 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-595c94944c-vtchh"]
Mar 01 09:24:12 crc kubenswrapper[4792]: E0301 09:24:12.754723 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerName="util"
Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.754824 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerName="util"
Mar 01 09:24:12 crc kubenswrapper[4792]: E0301 09:24:12.754886 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1263f40a-23c7-4ab8-8ebc-7c697e2eacd6" containerName="oc"
Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.755725 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1263f40a-23c7-4ab8-8ebc-7c697e2eacd6" containerName="oc"
Mar 01 09:24:12 crc kubenswrapper[4792]: E0301 09:24:12.755966 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerName="extract"
Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.756129 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerName="extract"
Mar 01 09:24:12 crc kubenswrapper[4792]: E0301 09:24:12.756245 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerName="pull"
Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.757178 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerName="pull"
Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.757589 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerName="extract"
Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.757736 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1263f40a-23c7-4ab8-8ebc-7c697e2eacd6" containerName="oc"
Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.758612 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh"
Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.764500 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-dq8rd"
Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.772974 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfs7s\" (UniqueName: \"kubernetes.io/projected/c967e6f5-6388-4ae5-9ccf-379b6305e1b0-kube-api-access-jfs7s\") pod \"openstack-operator-controller-init-595c94944c-vtchh\" (UID: \"c967e6f5-6388-4ae5-9ccf-379b6305e1b0\") " pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh"
Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.783450 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-595c94944c-vtchh"]
Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.874615 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfs7s\" (UniqueName: \"kubernetes.io/projected/c967e6f5-6388-4ae5-9ccf-379b6305e1b0-kube-api-access-jfs7s\") pod \"openstack-operator-controller-init-595c94944c-vtchh\" (UID: \"c967e6f5-6388-4ae5-9ccf-379b6305e1b0\") " pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh"
Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.904495 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfs7s\" (UniqueName: \"kubernetes.io/projected/c967e6f5-6388-4ae5-9ccf-379b6305e1b0-kube-api-access-jfs7s\") pod \"openstack-operator-controller-init-595c94944c-vtchh\" (UID: \"c967e6f5-6388-4ae5-9ccf-379b6305e1b0\") " pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh"
Mar 01 09:24:13 crc kubenswrapper[4792]: I0301 09:24:13.072571 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh"
Mar 01 09:24:13 crc kubenswrapper[4792]: I0301 09:24:13.537393 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-595c94944c-vtchh"]
Mar 01 09:24:13 crc kubenswrapper[4792]: W0301 09:24:13.541501 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc967e6f5_6388_4ae5_9ccf_379b6305e1b0.slice/crio-f586b15186f380abfcd39f1711097e90a17d914ffbbee23bffaf50c18c7555d4 WatchSource:0}: Error finding container f586b15186f380abfcd39f1711097e90a17d914ffbbee23bffaf50c18c7555d4: Status 404 returned error can't find the container with id f586b15186f380abfcd39f1711097e90a17d914ffbbee23bffaf50c18c7555d4
Mar 01 09:24:14 crc kubenswrapper[4792]: I0301 09:24:14.309038 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh" event={"ID":"c967e6f5-6388-4ae5-9ccf-379b6305e1b0","Type":"ContainerStarted","Data":"f586b15186f380abfcd39f1711097e90a17d914ffbbee23bffaf50c18c7555d4"}
Mar 01 09:24:18 crc kubenswrapper[4792]: I0301 09:24:18.349811 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh" event={"ID":"c967e6f5-6388-4ae5-9ccf-379b6305e1b0","Type":"ContainerStarted","Data":"7c99d3874715f2ca02422dd8190b6185d9da40caf33a75c71acb9739fc7fe999"}
Mar 01 09:24:18 crc kubenswrapper[4792]: I0301 09:24:18.350456 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh"
Mar 01 09:24:18 crc kubenswrapper[4792]: I0301 09:24:18.385673 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh" podStartSLOduration=1.775006104 podStartE2EDuration="6.385656424s" podCreationTimestamp="2026-03-01 09:24:12 +0000 UTC" firstStartedPulling="2026-03-01 09:24:13.545318599 +0000 UTC m=+982.787197796" lastFinishedPulling="2026-03-01 09:24:18.155968919 +0000 UTC m=+987.397848116" observedRunningTime="2026-03-01 09:24:18.380793065 +0000 UTC m=+987.622672262" watchObservedRunningTime="2026-03-01 09:24:18.385656424 +0000 UTC m=+987.627535621"
Mar 01 09:24:23 crc kubenswrapper[4792]: I0301 09:24:23.075414 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.487221 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-29ht5"]
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.488751 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.507935 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-29ht5"]
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.602087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-utilities\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.602130 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtpbt\" (UniqueName: \"kubernetes.io/projected/15fa5cd2-57f2-4589-9947-c4a227fa68b6-kube-api-access-dtpbt\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.602178 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-catalog-content\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.703128 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-utilities\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.703170 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtpbt\" (UniqueName: \"kubernetes.io/projected/15fa5cd2-57f2-4589-9947-c4a227fa68b6-kube-api-access-dtpbt\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.703206 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-catalog-content\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.703673 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-utilities\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.703716 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-catalog-content\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.733457 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtpbt\" (UniqueName: \"kubernetes.io/projected/15fa5cd2-57f2-4589-9947-c4a227fa68b6-kube-api-access-dtpbt\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.804294 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:38 crc kubenswrapper[4792]: I0301 09:24:38.215677 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-29ht5"]
Mar 01 09:24:38 crc kubenswrapper[4792]: I0301 09:24:38.466236 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29ht5" event={"ID":"15fa5cd2-57f2-4589-9947-c4a227fa68b6","Type":"ContainerStarted","Data":"2712402bbe2a400f1f172cc0f249c7e35edf7b64593d06c6d1cfd9d81ee06f57"}
Mar 01 09:24:38 crc kubenswrapper[4792]: I0301 09:24:38.466575 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29ht5" event={"ID":"15fa5cd2-57f2-4589-9947-c4a227fa68b6","Type":"ContainerStarted","Data":"5ee4a8e4800ff27037b52aa58b081311f95a6ba4d258c46fcee562038196b6f2"}
Mar 01 09:24:39 crc kubenswrapper[4792]: I0301 09:24:39.472117 4792 generic.go:334] "Generic (PLEG): container finished" podID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerID="2712402bbe2a400f1f172cc0f249c7e35edf7b64593d06c6d1cfd9d81ee06f57" exitCode=0
Mar 01 09:24:39 crc kubenswrapper[4792]: I0301 09:24:39.472154 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29ht5" event={"ID":"15fa5cd2-57f2-4589-9947-c4a227fa68b6","Type":"ContainerDied","Data":"2712402bbe2a400f1f172cc0f249c7e35edf7b64593d06c6d1cfd9d81ee06f57"}
Mar 01 09:24:40 crc kubenswrapper[4792]: I0301 09:24:40.480685 4792 generic.go:334] "Generic (PLEG): container finished" podID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerID="f4b48983a710c40494648dd6a515d3975deee5c28f7b927750a63de93e040785" exitCode=0
Mar 01 09:24:40 crc kubenswrapper[4792]: I0301 09:24:40.480791 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29ht5" event={"ID":"15fa5cd2-57f2-4589-9947-c4a227fa68b6","Type":"ContainerDied","Data":"f4b48983a710c40494648dd6a515d3975deee5c28f7b927750a63de93e040785"}
Mar 01 09:24:41 crc kubenswrapper[4792]: I0301 09:24:41.487695 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29ht5" event={"ID":"15fa5cd2-57f2-4589-9947-c4a227fa68b6","Type":"ContainerStarted","Data":"8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1"}
Mar 01 09:24:41 crc kubenswrapper[4792]: I0301 09:24:41.521512 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-29ht5" podStartSLOduration=3.123782451 podStartE2EDuration="4.521493524s" podCreationTimestamp="2026-03-01 09:24:37 +0000 UTC" firstStartedPulling="2026-03-01 09:24:39.473469596 +0000 UTC m=+1008.715348793" lastFinishedPulling="2026-03-01 09:24:40.871180659 +0000 UTC m=+1010.113059866" observedRunningTime="2026-03-01 09:24:41.51925853 +0000 UTC m=+1010.761137727" watchObservedRunningTime="2026-03-01 09:24:41.521493524 +0000 UTC m=+1010.763372721"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.552277 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.553003 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.562448 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.563168 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.563658 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-46k6h"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.564464 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj6r9\" (UniqueName: \"kubernetes.io/projected/b9e3fd6b-e3e2-4380-b8d7-900891df562a-kube-api-access-nj6r9\") pod \"barbican-operator-controller-manager-6db6876945-ggspg\" (UID: \"b9e3fd6b-e3e2-4380-b8d7-900891df562a\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.564508 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jd5z\" (UniqueName: \"kubernetes.io/projected/8741a141-0194-4eb2-956e-c41f4ffe1338-kube-api-access-7jd5z\") pod \"cinder-operator-controller-manager-55d77d7b5c-jlnsb\" (UID: \"8741a141-0194-4eb2-956e-c41f4ffe1338\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.569257 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-6bf7t"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.578356 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.609150 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.609862 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.615063 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-v4bn8"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.616544 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.632840 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.633605 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.644786 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.645194 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-btdr4"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.656876 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.665436 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jd5z\" (UniqueName: \"kubernetes.io/projected/8741a141-0194-4eb2-956e-c41f4ffe1338-kube-api-access-7jd5z\") pod \"cinder-operator-controller-manager-55d77d7b5c-jlnsb\" (UID: \"8741a141-0194-4eb2-956e-c41f4ffe1338\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.665503 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxh8z\" (UniqueName: \"kubernetes.io/projected/bf1f37ea-a566-4dfd-b45b-02f284f19ce3-kube-api-access-lxh8z\") pod \"designate-operator-controller-manager-5d87c9d997-72srw\" (UID: \"bf1f37ea-a566-4dfd-b45b-02f284f19ce3\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.665531 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqf5f\" (UniqueName: \"kubernetes.io/projected/02dd5cc0-c44b-4ede-972b-9d26c9c54100-kube-api-access-jqf5f\") pod \"glance-operator-controller-manager-64db6967f8-9wzbh\" (UID: \"02dd5cc0-c44b-4ede-972b-9d26c9c54100\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.665567 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj6r9\" (UniqueName: \"kubernetes.io/projected/b9e3fd6b-e3e2-4380-b8d7-900891df562a-kube-api-access-nj6r9\") pod \"barbican-operator-controller-manager-6db6876945-ggspg\" (UID: \"b9e3fd6b-e3e2-4380-b8d7-900891df562a\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.701686 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj6r9\" (UniqueName: \"kubernetes.io/projected/b9e3fd6b-e3e2-4380-b8d7-900891df562a-kube-api-access-nj6r9\") pod \"barbican-operator-controller-manager-6db6876945-ggspg\" (UID: \"b9e3fd6b-e3e2-4380-b8d7-900891df562a\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.705448 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jd5z\" (UniqueName: \"kubernetes.io/projected/8741a141-0194-4eb2-956e-c41f4ffe1338-kube-api-access-7jd5z\") pod \"cinder-operator-controller-manager-55d77d7b5c-jlnsb\" (UID: \"8741a141-0194-4eb2-956e-c41f4ffe1338\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.716854 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.727241 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.731973 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.732989 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.735268 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-f968d" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.743192 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-s6s9l" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.748742 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx"] Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.767619 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxh8z\" (UniqueName: \"kubernetes.io/projected/bf1f37ea-a566-4dfd-b45b-02f284f19ce3-kube-api-access-lxh8z\") pod \"designate-operator-controller-manager-5d87c9d997-72srw\" (UID: \"bf1f37ea-a566-4dfd-b45b-02f284f19ce3\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.767666 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqf5f\" (UniqueName: \"kubernetes.io/projected/02dd5cc0-c44b-4ede-972b-9d26c9c54100-kube-api-access-jqf5f\") pod \"glance-operator-controller-manager-64db6967f8-9wzbh\" (UID: \"02dd5cc0-c44b-4ede-972b-9d26c9c54100\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.776100 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r"] Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.791289 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"] Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.792159 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.806375 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.806612 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vn8ng" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.813296 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqf5f\" (UniqueName: \"kubernetes.io/projected/02dd5cc0-c44b-4ede-972b-9d26c9c54100-kube-api-access-jqf5f\") pod \"glance-operator-controller-manager-64db6967f8-9wzbh\" (UID: \"02dd5cc0-c44b-4ede-972b-9d26c9c54100\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.828006 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"] Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.830547 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxh8z\" (UniqueName: \"kubernetes.io/projected/bf1f37ea-a566-4dfd-b45b-02f284f19ce3-kube-api-access-lxh8z\") pod \"designate-operator-controller-manager-5d87c9d997-72srw\" (UID: \"bf1f37ea-a566-4dfd-b45b-02f284f19ce3\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.838348 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j"] Mar 01 09:24:42 
crc kubenswrapper[4792]: I0301 09:24:42.838954 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62"] Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.839420 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.839725 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.844711 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-gqj5r" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.845033 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-sdj2j" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.856723 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j"] Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.869466 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsb8q\" (UniqueName: \"kubernetes.io/projected/5044cf86-f557-41d4-b6c0-a41a668ac999-kube-api-access-vsb8q\") pod \"heat-operator-controller-manager-cf99c678f-7v65r\" (UID: \"5044cf86-f557-41d4-b6c0-a41a668ac999\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.877236 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.869497 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8khs\" (UniqueName: \"kubernetes.io/projected/cd83ed19-023d-43c2-92db-d290499db3d4-kube-api-access-d8khs\") pod \"horizon-operator-controller-manager-78bc7f9bd9-55qzx\" (UID: \"cd83ed19-023d-43c2-92db-d290499db3d4\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.879351 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t6l9\" (UniqueName: \"kubernetes.io/projected/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-kube-api-access-7t6l9\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.879384 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7q8t\" (UniqueName: \"kubernetes.io/projected/2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5-kube-api-access-c7q8t\") pod \"ironic-operator-controller-manager-545456dc4-jvw5j\" (UID: \"2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.879426 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxwh7\" (UniqueName: \"kubernetes.io/projected/234d2ae5-7589-44cc-83f4-b0ee8a91940a-kube-api-access-sxwh7\") pod \"keystone-operator-controller-manager-7c789f89c6-wjf62\" (UID: \"234d2ae5-7589-44cc-83f4-b0ee8a91940a\") " 
pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.879458 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.883235 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62"] Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.888591 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.909254 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6"] Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.910176 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.913264 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-ltcf8" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.919358 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn"] Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.920132 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.923570 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7lngp" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.932542 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.934962 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6"] Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.967196 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.971476 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2"] Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.972224 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980548 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxwh7\" (UniqueName: \"kubernetes.io/projected/234d2ae5-7589-44cc-83f4-b0ee8a91940a-kube-api-access-sxwh7\") pod \"keystone-operator-controller-manager-7c789f89c6-wjf62\" (UID: \"234d2ae5-7589-44cc-83f4-b0ee8a91940a\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980600 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980629 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9d52\" (UniqueName: \"kubernetes.io/projected/376afe52-646d-44b7-b32e-ce6cd6dc21a6-kube-api-access-q9d52\") pod \"manila-operator-controller-manager-67d996989d-t5fsn\" (UID: \"376afe52-646d-44b7-b32e-ce6cd6dc21a6\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980671 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsb8q\" (UniqueName: \"kubernetes.io/projected/5044cf86-f557-41d4-b6c0-a41a668ac999-kube-api-access-vsb8q\") pod \"heat-operator-controller-manager-cf99c678f-7v65r\" (UID: \"5044cf86-f557-41d4-b6c0-a41a668ac999\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980691 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8khs\" (UniqueName: \"kubernetes.io/projected/cd83ed19-023d-43c2-92db-d290499db3d4-kube-api-access-d8khs\") pod \"horizon-operator-controller-manager-78bc7f9bd9-55qzx\" (UID: \"cd83ed19-023d-43c2-92db-d290499db3d4\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980729 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhplh\" (UniqueName: \"kubernetes.io/projected/1793465e-1273-4250-a238-c99798788618-kube-api-access-rhplh\") pod \"mariadb-operator-controller-manager-7b6bfb6475-hlzm6\" (UID: \"1793465e-1273-4250-a238-c99798788618\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980761 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxgmx\" (UniqueName: \"kubernetes.io/projected/dfb10d33-c4f1-4287-be83-dff835c733ba-kube-api-access-vxgmx\") pod \"neutron-operator-controller-manager-54688575f-qjqd2\" (UID: \"dfb10d33-c4f1-4287-be83-dff835c733ba\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980789 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7q8t\" (UniqueName: \"kubernetes.io/projected/2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5-kube-api-access-c7q8t\") pod \"ironic-operator-controller-manager-545456dc4-jvw5j\" (UID: \"2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j" Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980808 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t6l9\" (UniqueName: 
\"kubernetes.io/projected/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-kube-api-access-7t6l9\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:42 crc kubenswrapper[4792]: E0301 09:24:42.981396 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 01 09:24:42 crc kubenswrapper[4792]: E0301 09:24:42.981469 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert podName:ea6739c2-185a-43e7-8fcf-0b2ae31957a0 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:43.481448883 +0000 UTC m=+1012.723328070 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert") pod "infra-operator-controller-manager-f7fcc58b9-dsqtf" (UID: "ea6739c2-185a-43e7-8fcf-0b2ae31957a0") : secret "infra-operator-webhook-server-cert" not found Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.003266 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7qcsv" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.016156 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.026628 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7q8t\" (UniqueName: \"kubernetes.io/projected/2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5-kube-api-access-c7q8t\") pod \"ironic-operator-controller-manager-545456dc4-jvw5j\" (UID: \"2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j" Mar 01 
09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.044284 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsb8q\" (UniqueName: \"kubernetes.io/projected/5044cf86-f557-41d4-b6c0-a41a668ac999-kube-api-access-vsb8q\") pod \"heat-operator-controller-manager-cf99c678f-7v65r\" (UID: \"5044cf86-f557-41d4-b6c0-a41a668ac999\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.047100 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8khs\" (UniqueName: \"kubernetes.io/projected/cd83ed19-023d-43c2-92db-d290499db3d4-kube-api-access-d8khs\") pod \"horizon-operator-controller-manager-78bc7f9bd9-55qzx\" (UID: \"cd83ed19-023d-43c2-92db-d290499db3d4\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.050961 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.054192 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxwh7\" (UniqueName: \"kubernetes.io/projected/234d2ae5-7589-44cc-83f4-b0ee8a91940a-kube-api-access-sxwh7\") pod \"keystone-operator-controller-manager-7c789f89c6-wjf62\" (UID: \"234d2ae5-7589-44cc-83f4-b0ee8a91940a\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.054393 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.055194 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.055924 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t6l9\" (UniqueName: \"kubernetes.io/projected/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-kube-api-access-7t6l9\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.065562 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-mbvwj" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.067481 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.067723 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.130038 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.137788 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9d52\" (UniqueName: \"kubernetes.io/projected/376afe52-646d-44b7-b32e-ce6cd6dc21a6-kube-api-access-q9d52\") pod \"manila-operator-controller-manager-67d996989d-t5fsn\" (UID: \"376afe52-646d-44b7-b32e-ce6cd6dc21a6\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.138365 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhplh\" (UniqueName: \"kubernetes.io/projected/1793465e-1273-4250-a238-c99798788618-kube-api-access-rhplh\") pod \"mariadb-operator-controller-manager-7b6bfb6475-hlzm6\" (UID: \"1793465e-1273-4250-a238-c99798788618\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.138627 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxgmx\" (UniqueName: \"kubernetes.io/projected/dfb10d33-c4f1-4287-be83-dff835c733ba-kube-api-access-vxgmx\") pod \"neutron-operator-controller-manager-54688575f-qjqd2\" (UID: \"dfb10d33-c4f1-4287-be83-dff835c733ba\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.154287 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.155502 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.161658 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-8znlx" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.165301 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.180917 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.214304 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.215643 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.216433 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.228926 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxgmx\" (UniqueName: \"kubernetes.io/projected/dfb10d33-c4f1-4287-be83-dff835c733ba-kube-api-access-vxgmx\") pod \"neutron-operator-controller-manager-54688575f-qjqd2\" (UID: \"dfb10d33-c4f1-4287-be83-dff835c733ba\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.230872 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9d52\" (UniqueName: \"kubernetes.io/projected/376afe52-646d-44b7-b32e-ce6cd6dc21a6-kube-api-access-q9d52\") pod \"manila-operator-controller-manager-67d996989d-t5fsn\" (UID: \"376afe52-646d-44b7-b32e-ce6cd6dc21a6\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.237355 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-h6dst" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.237773 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhplh\" (UniqueName: \"kubernetes.io/projected/1793465e-1273-4250-a238-c99798788618-kube-api-access-rhplh\") pod \"mariadb-operator-controller-manager-7b6bfb6475-hlzm6\" (UID: \"1793465e-1273-4250-a238-c99798788618\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.242133 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.242776 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcdk4\" (UniqueName: \"kubernetes.io/projected/8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9-kube-api-access-pcdk4\") pod \"nova-operator-controller-manager-74b6b5dc96-knk7m\" (UID: \"8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.262420 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.263496 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.267519 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8kxt9" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.290465 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.293882 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.300575 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-l5f87" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.304482 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.340788 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.348184 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m24nn\" (UniqueName: \"kubernetes.io/projected/9244686e-175e-45f9-9eb7-23621cd1f3cd-kube-api-access-m24nn\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.348345 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc6l4\" (UniqueName: \"kubernetes.io/projected/ecc17c18-7695-4d22-9a95-bcac51800d60-kube-api-access-lc6l4\") pod \"octavia-operator-controller-manager-5d86c7ddb7-54rpl\" (UID: \"ecc17c18-7695-4d22-9a95-bcac51800d60\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.348404 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.348459 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcdk4\" (UniqueName: \"kubernetes.io/projected/8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9-kube-api-access-pcdk4\") pod \"nova-operator-controller-manager-74b6b5dc96-knk7m\" (UID: \"8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.358543 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.386847 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.387848 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.399198 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-svkrk"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.412124 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcdk4\" (UniqueName: \"kubernetes.io/projected/8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9-kube-api-access-pcdk4\") pod \"nova-operator-controller-manager-74b6b5dc96-knk7m\" (UID: \"8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.460625 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.471867 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m24nn\" (UniqueName: \"kubernetes.io/projected/9244686e-175e-45f9-9eb7-23621cd1f3cd-kube-api-access-m24nn\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.471978 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fct2d\" (UniqueName: \"kubernetes.io/projected/808b8753-0a20-419b-8b04-dcbccaa2d77e-kube-api-access-fct2d\") pod \"placement-operator-controller-manager-648564c9fc-jdn6k\" (UID: \"808b8753-0a20-419b-8b04-dcbccaa2d77e\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.472054 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc6l4\" (UniqueName: \"kubernetes.io/projected/ecc17c18-7695-4d22-9a95-bcac51800d60-kube-api-access-lc6l4\") pod \"octavia-operator-controller-manager-5d86c7ddb7-54rpl\" (UID: \"ecc17c18-7695-4d22-9a95-bcac51800d60\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.472139 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.472182 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74rgc\" (UniqueName: \"kubernetes.io/projected/e0cef8e2-a392-4612-97c6-17c611b2a44e-kube-api-access-74rgc\") pod \"swift-operator-controller-manager-9b9ff9f4d-mqndr\" (UID: \"e0cef8e2-a392-4612-97c6-17c611b2a44e\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr"
Mar 01 09:24:43 crc kubenswrapper[4792]: E0301 09:24:43.472693 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 01 09:24:43 crc kubenswrapper[4792]: E0301 09:24:43.472735 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert podName:9244686e-175e-45f9-9eb7-23621cd1f3cd nodeName:}" failed. No retries permitted until 2026-03-01 09:24:43.972719927 +0000 UTC m=+1013.214599114 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776948grv" (UID: "9244686e-175e-45f9-9eb7-23621cd1f3cd") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.496159 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.520230 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.529254 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.553148 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.557506 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.559656 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.565145 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.565936 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.574112 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbz6d\" (UniqueName: \"kubernetes.io/projected/3d38195c-e4ff-49cf-9592-e9f52d73f2df-kube-api-access-hbz6d\") pod \"ovn-operator-controller-manager-75684d597f-zkx7c\" (UID: \"3d38195c-e4ff-49cf-9592-e9f52d73f2df\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.574218 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fct2d\" (UniqueName: \"kubernetes.io/projected/808b8753-0a20-419b-8b04-dcbccaa2d77e-kube-api-access-fct2d\") pod \"placement-operator-controller-manager-648564c9fc-jdn6k\" (UID: \"808b8753-0a20-419b-8b04-dcbccaa2d77e\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.574262 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.574333 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74rgc\" (UniqueName: \"kubernetes.io/projected/e0cef8e2-a392-4612-97c6-17c611b2a44e-kube-api-access-74rgc\") pod \"swift-operator-controller-manager-9b9ff9f4d-mqndr\" (UID: \"e0cef8e2-a392-4612-97c6-17c611b2a44e\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.584405 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-x7m9p"
Mar 01 09:24:43 crc kubenswrapper[4792]: E0301 09:24:43.584557 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 01 09:24:43 crc kubenswrapper[4792]: E0301 09:24:43.584609 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert podName:ea6739c2-185a-43e7-8fcf-0b2ae31957a0 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:44.584592481 +0000 UTC m=+1013.826471678 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert") pod "infra-operator-controller-manager-f7fcc58b9-dsqtf" (UID: "ea6739c2-185a-43e7-8fcf-0b2ae31957a0") : secret "infra-operator-webhook-server-cert" not found
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.584691 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.585320 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-ssz2h"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.585694 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc6l4\" (UniqueName: \"kubernetes.io/projected/ecc17c18-7695-4d22-9a95-bcac51800d60-kube-api-access-lc6l4\") pod \"octavia-operator-controller-manager-5d86c7ddb7-54rpl\" (UID: \"ecc17c18-7695-4d22-9a95-bcac51800d60\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.589102 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.610584 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.611474 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.625097 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.637325 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-ts4wr"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.650485 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.675579 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l9fq\" (UniqueName: \"kubernetes.io/projected/4fe8270e-a46d-40bc-8d24-a4585b196f5e-kube-api-access-8l9fq\") pod \"telemetry-operator-controller-manager-5fdb694969-jpxwz\" (UID: \"4fe8270e-a46d-40bc-8d24-a4585b196f5e\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.675664 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmrmm\" (UniqueName: \"kubernetes.io/projected/2970c60c-7b03-4667-99e4-08c094cdbfc2-kube-api-access-dmrmm\") pod \"test-operator-controller-manager-55b5ff4dbb-bcnns\" (UID: \"2970c60c-7b03-4667-99e4-08c094cdbfc2\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.675710 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m24nn\" (UniqueName: \"kubernetes.io/projected/9244686e-175e-45f9-9eb7-23621cd1f3cd-kube-api-access-m24nn\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.675805 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbz6d\" (UniqueName: \"kubernetes.io/projected/3d38195c-e4ff-49cf-9592-e9f52d73f2df-kube-api-access-hbz6d\") pod \"ovn-operator-controller-manager-75684d597f-zkx7c\" (UID: \"3d38195c-e4ff-49cf-9592-e9f52d73f2df\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.695655 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fct2d\" (UniqueName: \"kubernetes.io/projected/808b8753-0a20-419b-8b04-dcbccaa2d77e-kube-api-access-fct2d\") pod \"placement-operator-controller-manager-648564c9fc-jdn6k\" (UID: \"808b8753-0a20-419b-8b04-dcbccaa2d77e\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.697311 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.698188 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.698727 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74rgc\" (UniqueName: \"kubernetes.io/projected/e0cef8e2-a392-4612-97c6-17c611b2a44e-kube-api-access-74rgc\") pod \"swift-operator-controller-manager-9b9ff9f4d-mqndr\" (UID: \"e0cef8e2-a392-4612-97c6-17c611b2a44e\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.702528 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-chzwj"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.702789 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.702993 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.712744 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.718219 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbz6d\" (UniqueName: \"kubernetes.io/projected/3d38195c-e4ff-49cf-9592-e9f52d73f2df-kube-api-access-hbz6d\") pod \"ovn-operator-controller-manager-75684d597f-zkx7c\" (UID: \"3d38195c-e4ff-49cf-9592-e9f52d73f2df\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.732778 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.768391 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.791976 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmrmm\" (UniqueName: \"kubernetes.io/projected/2970c60c-7b03-4667-99e4-08c094cdbfc2-kube-api-access-dmrmm\") pod \"test-operator-controller-manager-55b5ff4dbb-bcnns\" (UID: \"2970c60c-7b03-4667-99e4-08c094cdbfc2\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.792491 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l9fq\" (UniqueName: \"kubernetes.io/projected/4fe8270e-a46d-40bc-8d24-a4585b196f5e-kube-api-access-8l9fq\") pod \"telemetry-operator-controller-manager-5fdb694969-jpxwz\" (UID: \"4fe8270e-a46d-40bc-8d24-a4585b196f5e\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.792544 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jwgz\" (UniqueName: \"kubernetes.io/projected/e45ebab9-87d5-4b2f-b3d1-f1832864584d-kube-api-access-7jwgz\") pod \"watcher-operator-controller-manager-bccc79885-64lkf\" (UID: \"e45ebab9-87d5-4b2f-b3d1-f1832864584d\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.897307 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.897372 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2wh\" (UniqueName: \"kubernetes.io/projected/d1d3783f-78e9-461a-916a-5a46e3083e70-kube-api-access-vh2wh\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.897450 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jwgz\" (UniqueName: \"kubernetes.io/projected/e45ebab9-87d5-4b2f-b3d1-f1832864584d-kube-api-access-7jwgz\") pod \"watcher-operator-controller-manager-bccc79885-64lkf\" (UID: \"e45ebab9-87d5-4b2f-b3d1-f1832864584d\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.897479 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.918599 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l9fq\" (UniqueName: \"kubernetes.io/projected/4fe8270e-a46d-40bc-8d24-a4585b196f5e-kube-api-access-8l9fq\") pod \"telemetry-operator-controller-manager-5fdb694969-jpxwz\" (UID: \"4fe8270e-a46d-40bc-8d24-a4585b196f5e\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.927785 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmrmm\" (UniqueName: \"kubernetes.io/projected/2970c60c-7b03-4667-99e4-08c094cdbfc2-kube-api-access-dmrmm\") pod \"test-operator-controller-manager-55b5ff4dbb-bcnns\" (UID: \"2970c60c-7b03-4667-99e4-08c094cdbfc2\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.937684 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jwgz\" (UniqueName: \"kubernetes.io/projected/e45ebab9-87d5-4b2f-b3d1-f1832864584d-kube-api-access-7jwgz\") pod \"watcher-operator-controller-manager-bccc79885-64lkf\" (UID: \"e45ebab9-87d5-4b2f-b3d1-f1832864584d\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.948715 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.965960 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.966714 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.967327 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.977412 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-vr24h"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.978924 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.989121 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf"
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:43.996819 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m"]
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.010221 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.010414 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.010481 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.010522 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2wh\" (UniqueName: \"kubernetes.io/projected/d1d3783f-78e9-461a-916a-5a46e3083e70-kube-api-access-vh2wh\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"
Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.010981 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.011020 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:44.511006443 +0000 UTC m=+1013.752885640 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "webhook-server-cert" not found
Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.011060 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.011078 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert podName:9244686e-175e-45f9-9eb7-23621cd1f3cd nodeName:}" failed. No retries permitted until 2026-03-01 09:24:45.011072125 +0000 UTC m=+1014.252951322 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776948grv" (UID: "9244686e-175e-45f9-9eb7-23621cd1f3cd") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.011109 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.011127 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:44.511122146 +0000 UTC m=+1013.753001333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "metrics-server-cert" not found
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.050530 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2wh\" (UniqueName: \"kubernetes.io/projected/d1d3783f-78e9-461a-916a-5a46e3083e70-kube-api-access-vh2wh\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.113608 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsppl\" (UniqueName: \"kubernetes.io/projected/1ecd6b07-eda9-41d6-90af-6471699ff808-kube-api-access-nsppl\") pod \"rabbitmq-cluster-operator-manager-668c99d594-r5l9m\" (UID: \"1ecd6b07-eda9-41d6-90af-6471699ff808\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m"
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.216294 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsppl\" (UniqueName: \"kubernetes.io/projected/1ecd6b07-eda9-41d6-90af-6471699ff808-kube-api-access-nsppl\") pod \"rabbitmq-cluster-operator-manager-668c99d594-r5l9m\" (UID: \"1ecd6b07-eda9-41d6-90af-6471699ff808\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m"
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.225314 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns"
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.259750 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsppl\" (UniqueName: \"kubernetes.io/projected/1ecd6b07-eda9-41d6-90af-6471699ff808-kube-api-access-nsppl\") pod \"rabbitmq-cluster-operator-manager-668c99d594-r5l9m\" (UID: \"1ecd6b07-eda9-41d6-90af-6471699ff808\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m"
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.371168 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m"
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.491335 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"]
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.497139 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"]
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.527143 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" event={"ID":"bf1f37ea-a566-4dfd-b45b-02f284f19ce3","Type":"ContainerStarted","Data":"ba10b617caf1f37d826b32cdf76709c174e0681e6bf47c736021b5c1eddff1d1"}
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.527457 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.527526 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"
Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.529306 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.529361 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:45.529346841 +0000 UTC m=+1014.771226038 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "webhook-server-cert" not found
Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.536006 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.536051 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:45.536039335 +0000 UTC m=+1014.777918532 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "metrics-server-cert" not found
Mar 01 09:24:44 crc kubenswrapper[4792]: W0301 09:24:44.549210 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8741a141_0194_4eb2_956e_c41f4ffe1338.slice/crio-d5adb0b48ab275232d9602130968f0100e1c618b83eacd5c6a159607b3b0635a WatchSource:0}: Error finding container d5adb0b48ab275232d9602130968f0100e1c618b83eacd5c6a159607b3b0635a: Status 404 returned error can't find the container with id d5adb0b48ab275232d9602130968f0100e1c618b83eacd5c6a159607b3b0635a
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.570130 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r"]
Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.629114 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"
Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.629336 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.629382 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert podName:ea6739c2-185a-43e7-8fcf-0b2ae31957a0 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:46.629367955 +0000 UTC m=+1015.871247152 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert") pod "infra-operator-controller-manager-f7fcc58b9-dsqtf" (UID: "ea6739c2-185a-43e7-8fcf-0b2ae31957a0") : secret "infra-operator-webhook-server-cert" not found
Mar 01 09:24:45 crc kubenswrapper[4792]: I0301 09:24:45.032887 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"
Mar 01 09:24:45 crc kubenswrapper[4792]: E0301 09:24:45.033922 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 01 09:24:45 crc kubenswrapper[4792]: E0301 09:24:45.033980 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert podName:9244686e-175e-45f9-9eb7-23621cd1f3cd nodeName:}" failed. No retries permitted until 2026-03-01 09:24:47.033962642 +0000 UTC m=+1016.275841839 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776948grv" (UID: "9244686e-175e-45f9-9eb7-23621cd1f3cd") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 01 09:24:45 crc kubenswrapper[4792]: I0301 09:24:45.533641 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb" event={"ID":"8741a141-0194-4eb2-956e-c41f4ffe1338","Type":"ContainerStarted","Data":"d5adb0b48ab275232d9602130968f0100e1c618b83eacd5c6a159607b3b0635a"}
Mar 01 09:24:45 crc kubenswrapper[4792]: I0301 09:24:45.534704 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg" event={"ID":"b9e3fd6b-e3e2-4380-b8d7-900891df562a","Type":"ContainerStarted","Data":"aa3b1164026ce2782aba927f155fc7e25c18b5439d62dda12c446f84dd90c59e"}
Mar 01 09:24:45 crc kubenswrapper[4792]: I0301 09:24:45.540026 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"
Mar 01 09:24:45 crc kubenswrapper[4792]: I0301 09:24:45.540199 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"
Mar 01 09:24:45 crc kubenswrapper[4792]: E0301 09:24:45.540328 4792 secret.go:188] Couldn't 
get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 01 09:24:45 crc kubenswrapper[4792]: E0301 09:24:45.540386 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:47.540372977 +0000 UTC m=+1016.782252174 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "metrics-server-cert" not found Mar 01 09:24:45 crc kubenswrapper[4792]: E0301 09:24:45.540731 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 01 09:24:45 crc kubenswrapper[4792]: E0301 09:24:45.540772 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:47.540762166 +0000 UTC m=+1016.782641363 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "webhook-server-cert" not found Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.081424 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx"] Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.572888 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m"] Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.581322 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" event={"ID":"cd83ed19-023d-43c2-92db-d290499db3d4","Type":"ContainerStarted","Data":"80b1a2de06a8adc305b302ebf919b5861887ec48b367ad9db461383f27c4cd8b"} Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.594346 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh"] Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.601173 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j"] Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.601817 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r" event={"ID":"5044cf86-f557-41d4-b6c0-a41a668ac999","Type":"ContainerStarted","Data":"e1b9edd3b1e3986848baa31be4313f49d7a1d702a243dcf54f6ab6440910eafa"} Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.639001 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn"] Mar 01 09:24:46 crc 
kubenswrapper[4792]: I0301 09:24:46.697003 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:46 crc kubenswrapper[4792]: E0301 09:24:46.697127 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 01 09:24:46 crc kubenswrapper[4792]: E0301 09:24:46.697173 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert podName:ea6739c2-185a-43e7-8fcf-0b2ae31957a0 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:50.697157677 +0000 UTC m=+1019.939036874 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert") pod "infra-operator-controller-manager-f7fcc58b9-dsqtf" (UID: "ea6739c2-185a-43e7-8fcf-0b2ae31957a0") : secret "infra-operator-webhook-server-cert" not found Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.711982 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2"] Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.781529 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62"] Mar 01 09:24:46 crc kubenswrapper[4792]: W0301 09:24:46.790935 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod234d2ae5_7589_44cc_83f4_b0ee8a91940a.slice/crio-960c1a0af0708f56ab6591644ca45dde3daa4e5598f8db495768e71b953665b9 WatchSource:0}: Error finding 
container 960c1a0af0708f56ab6591644ca45dde3daa4e5598f8db495768e71b953665b9: Status 404 returned error can't find the container with id 960c1a0af0708f56ab6591644ca45dde3daa4e5598f8db495768e71b953665b9 Mar 01 09:24:46 crc kubenswrapper[4792]: W0301 09:24:46.822441 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1793465e_1273_4250_a238_c99798788618.slice/crio-c3c82ce45a98801bb1bab97f4fbba734ecdffde5eff16e4d1ae09024fd6bba16 WatchSource:0}: Error finding container c3c82ce45a98801bb1bab97f4fbba734ecdffde5eff16e4d1ae09024fd6bba16: Status 404 returned error can't find the container with id c3c82ce45a98801bb1bab97f4fbba734ecdffde5eff16e4d1ae09024fd6bba16 Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.823549 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6"] Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.921817 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl"] Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.015185 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c"] Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.033817 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr"] Mar 01 09:24:47 crc kubenswrapper[4792]: W0301 09:24:47.035554 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d38195c_e4ff_49cf_9592_e9f52d73f2df.slice/crio-4e28b880eea6a7bbde96fd8689733c02a6616cf895cd282660b903a0ac994c42 WatchSource:0}: Error finding container 4e28b880eea6a7bbde96fd8689733c02a6616cf895cd282660b903a0ac994c42: Status 404 returned error can't find the container with id 
4e28b880eea6a7bbde96fd8689733c02a6616cf895cd282660b903a0ac994c42 Mar 01 09:24:47 crc kubenswrapper[4792]: W0301 09:24:47.054268 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0cef8e2_a392_4612_97c6_17c611b2a44e.slice/crio-a5c2f882f8dcf8e2e6ceeec8064ddf7477298dd78dc8a91fcdbe032c8be5b4f2 WatchSource:0}: Error finding container a5c2f882f8dcf8e2e6ceeec8064ddf7477298dd78dc8a91fcdbe032c8be5b4f2: Status 404 returned error can't find the container with id a5c2f882f8dcf8e2e6ceeec8064ddf7477298dd78dc8a91fcdbe032c8be5b4f2 Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.099990 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns"] Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.118092 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.118282 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.118342 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert podName:9244686e-175e-45f9-9eb7-23621cd1f3cd nodeName:}" failed. No retries permitted until 2026-03-01 09:24:51.118323551 +0000 UTC m=+1020.360202748 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776948grv" (UID: "9244686e-175e-45f9-9eb7-23621cd1f3cd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.165034 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m"] Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.176516 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz"] Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.230097 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf"] Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.238677 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k"] Mar 01 09:24:47 crc kubenswrapper[4792]: W0301 09:24:47.241632 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ecd6b07_eda9_41d6_90af_6471699ff808.slice/crio-d691d0399e469677d6ae9177034885648dffc0a75190738c79db56036d6e9782 WatchSource:0}: Error finding container d691d0399e469677d6ae9177034885648dffc0a75190738c79db56036d6e9782: Status 404 returned error can't find the container with id d691d0399e469677d6ae9177034885648dffc0a75190738c79db56036d6e9782 Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.257524 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7jwgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-64lkf_openstack-operators(e45ebab9-87d5-4b2f-b3d1-f1832864584d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.258723 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" podUID="e45ebab9-87d5-4b2f-b3d1-f1832864584d" Mar 01 09:24:47 crc kubenswrapper[4792]: W0301 09:24:47.273759 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod808b8753_0a20_419b_8b04_dcbccaa2d77e.slice/crio-7f7e9f7e093cd111450949696d888963e0ce1a82c52679f952863446b592ba38 WatchSource:0}: Error finding container 7f7e9f7e093cd111450949696d888963e0ce1a82c52679f952863446b592ba38: Status 404 returned error can't find the container with id 7f7e9f7e093cd111450949696d888963e0ce1a82c52679f952863446b592ba38 Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.276545 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fct2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-648564c9fc-jdn6k_openstack-operators(808b8753-0a20-419b-8b04-dcbccaa2d77e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.278178 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" podUID="808b8753-0a20-419b-8b04-dcbccaa2d77e" Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.631692 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.631803 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.631932 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.632006 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.632027 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:51.632007714 +0000 UTC m=+1020.873886901 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "metrics-server-cert" not found Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.632069 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:51.632047955 +0000 UTC m=+1020.873927152 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "webhook-server-cert" not found Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.638738 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c" event={"ID":"3d38195c-e4ff-49cf-9592-e9f52d73f2df","Type":"ContainerStarted","Data":"4e28b880eea6a7bbde96fd8689733c02a6616cf895cd282660b903a0ac994c42"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.642689 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" event={"ID":"e0cef8e2-a392-4612-97c6-17c611b2a44e","Type":"ContainerStarted","Data":"a5c2f882f8dcf8e2e6ceeec8064ddf7477298dd78dc8a91fcdbe032c8be5b4f2"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.668570 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j" event={"ID":"2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5","Type":"ContainerStarted","Data":"135a878ebf26e55ab721a8b3c8811e6af8d3324717b4b08c3ff36c2f50a806d5"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.671451 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" event={"ID":"ecc17c18-7695-4d22-9a95-bcac51800d60","Type":"ContainerStarted","Data":"7699d89194923baa5130b849d85533c2a4f80b67c3079cadb83ed7942a341128"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.672594 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" 
event={"ID":"234d2ae5-7589-44cc-83f4-b0ee8a91940a","Type":"ContainerStarted","Data":"960c1a0af0708f56ab6591644ca45dde3daa4e5598f8db495768e71b953665b9"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.673882 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" event={"ID":"808b8753-0a20-419b-8b04-dcbccaa2d77e","Type":"ContainerStarted","Data":"7f7e9f7e093cd111450949696d888963e0ce1a82c52679f952863446b592ba38"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.675008 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" event={"ID":"4fe8270e-a46d-40bc-8d24-a4585b196f5e","Type":"ContainerStarted","Data":"94c097d9c3e6a279e675034e53573d9abf36ff13f40f2a4f7bd46fedbbd4a885"} Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.675154 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" podUID="808b8753-0a20-419b-8b04-dcbccaa2d77e" Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.677027 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" event={"ID":"2970c60c-7b03-4667-99e4-08c094cdbfc2","Type":"ContainerStarted","Data":"17aab2391d2d662cc1edf1caabcc155551d4b00dd50fb74030ede2e96fb63e50"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.679505 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" 
event={"ID":"e45ebab9-87d5-4b2f-b3d1-f1832864584d","Type":"ContainerStarted","Data":"ee959300b9c941a04eb70d52b1531fd8c48a4983f714abc59aa4c8f3d9148c49"} Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.680962 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" podUID="e45ebab9-87d5-4b2f-b3d1-f1832864584d" Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.681727 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m" event={"ID":"1ecd6b07-eda9-41d6-90af-6471699ff808","Type":"ContainerStarted","Data":"d691d0399e469677d6ae9177034885648dffc0a75190738c79db56036d6e9782"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.683845 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" event={"ID":"8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9","Type":"ContainerStarted","Data":"dfe782533c3722521e20eace8fe429e3a265f6fb32963109d2cf78121d3153ac"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.685708 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" event={"ID":"376afe52-646d-44b7-b32e-ce6cd6dc21a6","Type":"ContainerStarted","Data":"609b87890865e6905eeff0b5ac5b546e05c5efef3c92fab1622c004625e51d14"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.689396 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2" 
event={"ID":"dfb10d33-c4f1-4287-be83-dff835c733ba","Type":"ContainerStarted","Data":"00a2c32629d94ae8fd350196ced6e1e95c4402f4a0916b2f1ad846c6db2f17aa"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.693680 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh" event={"ID":"02dd5cc0-c44b-4ede-972b-9d26c9c54100","Type":"ContainerStarted","Data":"69a665b2ed8de862fa907844a28352a770d035c6e90eac4279c14d89c6409a8f"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.697575 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" event={"ID":"1793465e-1273-4250-a238-c99798788618","Type":"ContainerStarted","Data":"c3c82ce45a98801bb1bab97f4fbba734ecdffde5eff16e4d1ae09024fd6bba16"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.804735 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-29ht5" Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.804783 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-29ht5" Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.869483 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-29ht5" Mar 01 09:24:48 crc kubenswrapper[4792]: E0301 09:24:48.729147 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" podUID="808b8753-0a20-419b-8b04-dcbccaa2d77e" Mar 01 09:24:48 crc kubenswrapper[4792]: E0301 09:24:48.729514 4792 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" podUID="e45ebab9-87d5-4b2f-b3d1-f1832864584d" Mar 01 09:24:48 crc kubenswrapper[4792]: I0301 09:24:48.875471 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vgfjs"] Mar 01 09:24:48 crc kubenswrapper[4792]: I0301 09:24:48.876919 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:48 crc kubenswrapper[4792]: I0301 09:24:48.903749 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vgfjs"] Mar 01 09:24:48 crc kubenswrapper[4792]: I0301 09:24:48.967934 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-utilities\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:48 crc kubenswrapper[4792]: I0301 09:24:48.967988 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-catalog-content\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:48 crc kubenswrapper[4792]: I0301 09:24:48.968056 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xztgl\" (UniqueName: 
\"kubernetes.io/projected/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-kube-api-access-xztgl\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:49 crc kubenswrapper[4792]: I0301 09:24:48.996108 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-29ht5" Mar 01 09:24:49 crc kubenswrapper[4792]: I0301 09:24:49.068793 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-utilities\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:49 crc kubenswrapper[4792]: I0301 09:24:49.068830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-catalog-content\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:49 crc kubenswrapper[4792]: I0301 09:24:49.068888 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xztgl\" (UniqueName: \"kubernetes.io/projected/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-kube-api-access-xztgl\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:49 crc kubenswrapper[4792]: I0301 09:24:49.069553 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-utilities\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:49 
crc kubenswrapper[4792]: I0301 09:24:49.069571 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-catalog-content\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:49 crc kubenswrapper[4792]: I0301 09:24:49.108281 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xztgl\" (UniqueName: \"kubernetes.io/projected/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-kube-api-access-xztgl\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:49 crc kubenswrapper[4792]: I0301 09:24:49.205651 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:50 crc kubenswrapper[4792]: I0301 09:24:50.000288 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vgfjs"] Mar 01 09:24:50 crc kubenswrapper[4792]: I0301 09:24:50.710052 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:50 crc kubenswrapper[4792]: E0301 09:24:50.710425 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 01 09:24:50 crc kubenswrapper[4792]: E0301 09:24:50.710481 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert 
podName:ea6739c2-185a-43e7-8fcf-0b2ae31957a0 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:58.710467633 +0000 UTC m=+1027.952346830 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert") pod "infra-operator-controller-manager-f7fcc58b9-dsqtf" (UID: "ea6739c2-185a-43e7-8fcf-0b2ae31957a0") : secret "infra-operator-webhook-server-cert" not found Mar 01 09:24:50 crc kubenswrapper[4792]: I0301 09:24:50.798398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfjs" event={"ID":"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a","Type":"ContainerStarted","Data":"07b2ccb12f444a169347668634bc26575de0ebccb8ee9dc035b529cef91259bc"} Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.217733 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:51 crc kubenswrapper[4792]: E0301 09:24:51.218006 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 01 09:24:51 crc kubenswrapper[4792]: E0301 09:24:51.218052 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert podName:9244686e-175e-45f9-9eb7-23621cd1f3cd nodeName:}" failed. No retries permitted until 2026-03-01 09:24:59.218039116 +0000 UTC m=+1028.459918303 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776948grv" (UID: "9244686e-175e-45f9-9eb7-23621cd1f3cd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.302960 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-29ht5"] Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.303180 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-29ht5" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="registry-server" containerID="cri-o://8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" gracePeriod=2 Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.532762 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-76rvg"] Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.535760 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.539023 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-76rvg"] Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.624326 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c9l7\" (UniqueName: \"kubernetes.io/projected/5ee54ba4-1afe-492b-a35b-23f0da447772-kube-api-access-4c9l7\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.624692 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-catalog-content\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.624771 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-utilities\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.732742 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 
09:24:51.732791 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c9l7\" (UniqueName: \"kubernetes.io/projected/5ee54ba4-1afe-492b-a35b-23f0da447772-kube-api-access-4c9l7\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.732836 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-catalog-content\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.732871 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.732933 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-utilities\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.733884 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-utilities\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc 
kubenswrapper[4792]: E0301 09:24:51.734055 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 01 09:24:51 crc kubenswrapper[4792]: E0301 09:24:51.734146 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:59.734127888 +0000 UTC m=+1028.976007085 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "metrics-server-cert" not found Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.735422 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-catalog-content\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: E0301 09:24:51.735563 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 01 09:24:51 crc kubenswrapper[4792]: E0301 09:24:51.735628 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:59.735585834 +0000 UTC m=+1028.977465031 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "webhook-server-cert" not found Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.779095 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c9l7\" (UniqueName: \"kubernetes.io/projected/5ee54ba4-1afe-492b-a35b-23f0da447772-kube-api-access-4c9l7\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.854636 4792 generic.go:334] "Generic (PLEG): container finished" podID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" exitCode=0 Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.854691 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29ht5" event={"ID":"15fa5cd2-57f2-4589-9947-c4a227fa68b6","Type":"ContainerDied","Data":"8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1"} Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.861768 4792 generic.go:334] "Generic (PLEG): container finished" podID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerID="eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526" exitCode=0 Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.862831 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfjs" event={"ID":"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a","Type":"ContainerDied","Data":"eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526"} Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.872528 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:52 crc kubenswrapper[4792]: I0301 09:24:52.946678 4792 scope.go:117] "RemoveContainer" containerID="ad33205b5c6776c36f5f90bc6d51a56bb6cf073bf39f5d634c13c03da022cc95" Mar 01 09:24:57 crc kubenswrapper[4792]: E0301 09:24:57.805704 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" cmd=["grpc_health_probe","-addr=:50051"] Mar 01 09:24:57 crc kubenswrapper[4792]: E0301 09:24:57.806726 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" cmd=["grpc_health_probe","-addr=:50051"] Mar 01 09:24:57 crc kubenswrapper[4792]: E0301 09:24:57.807110 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" cmd=["grpc_health_probe","-addr=:50051"] Mar 01 09:24:57 crc kubenswrapper[4792]: E0301 09:24:57.807144 4792 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-29ht5" 
podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="registry-server" Mar 01 09:24:58 crc kubenswrapper[4792]: I0301 09:24:58.735315 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:58 crc kubenswrapper[4792]: I0301 09:24:58.743105 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:58 crc kubenswrapper[4792]: I0301 09:24:58.760368 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vn8ng" Mar 01 09:24:58 crc kubenswrapper[4792]: I0301 09:24:58.769334 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.242429 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.246458 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.501828 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-h6dst" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.510986 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.749514 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.749608 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.757187 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.758755 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.946411 4792 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-chzwj" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.954750 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.113275 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.113860 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8l9fq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5fdb694969-jpxwz_openstack-operators(4fe8270e-a46d-40bc-8d24-a4585b196f5e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.115012 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" podUID="4fe8270e-a46d-40bc-8d24-a4585b196f5e" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.899273 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.899525 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lc6l4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5d86c7ddb7-54rpl_openstack-operators(ecc17c18-7695-4d22-9a95-bcac51800d60): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.900712 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" podUID="ecc17c18-7695-4d22-9a95-bcac51800d60" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.944244 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" podUID="ecc17c18-7695-4d22-9a95-bcac51800d60" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.944333 4792 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" podUID="4fe8270e-a46d-40bc-8d24-a4585b196f5e" Mar 01 09:25:05 crc kubenswrapper[4792]: E0301 09:25:05.278744 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968" Mar 01 09:25:05 crc kubenswrapper[4792]: E0301 09:25:05.278950 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dmrmm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-55b5ff4dbb-bcnns_openstack-operators(2970c60c-7b03-4667-99e4-08c094cdbfc2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:05 crc kubenswrapper[4792]: E0301 09:25:05.280126 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" podUID="2970c60c-7b03-4667-99e4-08c094cdbfc2" Mar 01 09:25:05 crc kubenswrapper[4792]: E0301 09:25:05.952923 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" podUID="2970c60c-7b03-4667-99e4-08c094cdbfc2" Mar 01 09:25:06 crc kubenswrapper[4792]: E0301 09:25:06.946498 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7" Mar 01 09:25:06 crc kubenswrapper[4792]: E0301 09:25:06.946658 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-74rgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9b9ff9f4d-mqndr_openstack-operators(e0cef8e2-a392-4612-97c6-17c611b2a44e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:06 crc kubenswrapper[4792]: E0301 09:25:06.947829 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" podUID="e0cef8e2-a392-4612-97c6-17c611b2a44e" Mar 01 09:25:06 crc kubenswrapper[4792]: E0301 09:25:06.958639 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" podUID="e0cef8e2-a392-4612-97c6-17c611b2a44e" Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.574401 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505" Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.574607 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rhplh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-7b6bfb6475-hlzm6_openstack-operators(1793465e-1273-4250-a238-c99798788618): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.575765 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" podUID="1793465e-1273-4250-a238-c99798788618" Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.805587 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" cmd=["grpc_health_probe","-addr=:50051"] Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.822755 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" cmd=["grpc_health_probe","-addr=:50051"] Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.823333 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" cmd=["grpc_health_probe","-addr=:50051"] Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.823401 4792 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-29ht5" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="registry-server" Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.976275 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505\\\"\"" 
pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" podUID="1793465e-1273-4250-a238-c99798788618" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.364952 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.365375 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jqf5f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-64db6967f8-9wzbh_openstack-operators(02dd5cc0-c44b-4ede-972b-9d26c9c54100): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.366704 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh" podUID="02dd5cc0-c44b-4ede-972b-9d26c9c54100" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.902996 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.903167 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nj6r9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-6db6876945-ggspg_openstack-operators(b9e3fd6b-e3e2-4380-b8d7-900891df562a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.904559 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg" podUID="b9e3fd6b-e3e2-4380-b8d7-900891df562a" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.993793 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg" podUID="b9e3fd6b-e3e2-4380-b8d7-900891df562a" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.993793 4792 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051\\\"\"" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh" podUID="02dd5cc0-c44b-4ede-972b-9d26c9c54100" Mar 01 09:25:12 crc kubenswrapper[4792]: E0301 09:25:12.817842 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26" Mar 01 09:25:12 crc kubenswrapper[4792]: E0301 09:25:12.818996 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q9d52,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-67d996989d-t5fsn_openstack-operators(376afe52-646d-44b7-b32e-ce6cd6dc21a6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:12 crc kubenswrapper[4792]: E0301 09:25:12.821611 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" podUID="376afe52-646d-44b7-b32e-ce6cd6dc21a6" Mar 01 09:25:13 crc kubenswrapper[4792]: E0301 09:25:13.007189 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26\\\"\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" podUID="376afe52-646d-44b7-b32e-ce6cd6dc21a6" Mar 01 09:25:13 crc kubenswrapper[4792]: E0301 09:25:13.403666 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:508859beb0e5b69169393dbb0039dc03a9d4ba05f16f6ff74f9b25e19d446214" Mar 01 09:25:13 crc kubenswrapper[4792]: E0301 09:25:13.405432 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:508859beb0e5b69169393dbb0039dc03a9d4ba05f16f6ff74f9b25e19d446214,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lxh8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-5d87c9d997-72srw_openstack-operators(bf1f37ea-a566-4dfd-b45b-02f284f19ce3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:13 crc kubenswrapper[4792]: E0301 09:25:13.407201 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" podUID="bf1f37ea-a566-4dfd-b45b-02f284f19ce3" Mar 01 09:25:13 crc kubenswrapper[4792]: E0301 09:25:13.972480 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3" Mar 01 09:25:13 crc kubenswrapper[4792]: E0301 09:25:13.973074 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d8khs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-78bc7f9bd9-55qzx_openstack-operators(cd83ed19-023d-43c2-92db-d290499db3d4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:13 crc kubenswrapper[4792]: E0301 09:25:13.974217 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" podUID="cd83ed19-023d-43c2-92db-d290499db3d4" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.017630 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29ht5" event={"ID":"15fa5cd2-57f2-4589-9947-c4a227fa68b6","Type":"ContainerDied","Data":"5ee4a8e4800ff27037b52aa58b081311f95a6ba4d258c46fcee562038196b6f2"} Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.017666 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ee4a8e4800ff27037b52aa58b081311f95a6ba4d258c46fcee562038196b6f2" Mar 01 09:25:14 crc kubenswrapper[4792]: E0301 
09:25:14.019350 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:508859beb0e5b69169393dbb0039dc03a9d4ba05f16f6ff74f9b25e19d446214\\\"\"" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" podUID="bf1f37ea-a566-4dfd-b45b-02f284f19ce3" Mar 01 09:25:14 crc kubenswrapper[4792]: E0301 09:25:14.019353 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" podUID="cd83ed19-023d-43c2-92db-d290499db3d4" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.066294 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29ht5" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.152133 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-utilities\") pod \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.152251 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtpbt\" (UniqueName: \"kubernetes.io/projected/15fa5cd2-57f2-4589-9947-c4a227fa68b6-kube-api-access-dtpbt\") pod \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.152317 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-catalog-content\") pod \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.153883 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-utilities" (OuterVolumeSpecName: "utilities") pod "15fa5cd2-57f2-4589-9947-c4a227fa68b6" (UID: "15fa5cd2-57f2-4589-9947-c4a227fa68b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.158190 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15fa5cd2-57f2-4589-9947-c4a227fa68b6-kube-api-access-dtpbt" (OuterVolumeSpecName: "kube-api-access-dtpbt") pod "15fa5cd2-57f2-4589-9947-c4a227fa68b6" (UID: "15fa5cd2-57f2-4589-9947-c4a227fa68b6"). InnerVolumeSpecName "kube-api-access-dtpbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.183328 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15fa5cd2-57f2-4589-9947-c4a227fa68b6" (UID: "15fa5cd2-57f2-4589-9947-c4a227fa68b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.254384 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.254415 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtpbt\" (UniqueName: \"kubernetes.io/projected/15fa5cd2-57f2-4589-9947-c4a227fa68b6-kube-api-access-dtpbt\") on node \"crc\" DevicePath \"\"" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.254426 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:25:14 crc kubenswrapper[4792]: E0301 09:25:14.659202 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84" Mar 01 09:25:14 crc kubenswrapper[4792]: E0301 09:25:14.659381 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pcdk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-74b6b5dc96-knk7m_openstack-operators(8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:14 crc kubenswrapper[4792]: E0301 09:25:14.660699 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" podUID="8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9" Mar 01 09:25:15 crc kubenswrapper[4792]: I0301 09:25:15.028401 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29ht5" Mar 01 09:25:15 crc kubenswrapper[4792]: E0301 09:25:15.029164 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" podUID="8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9" Mar 01 09:25:15 crc kubenswrapper[4792]: I0301 09:25:15.070357 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-29ht5"] Mar 01 09:25:15 crc kubenswrapper[4792]: I0301 09:25:15.078686 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-29ht5"] Mar 01 09:25:15 crc kubenswrapper[4792]: E0301 09:25:15.262422 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97" Mar 01 09:25:15 crc kubenswrapper[4792]: E0301 09:25:15.262581 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7jwgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-64lkf_openstack-operators(e45ebab9-87d5-4b2f-b3d1-f1832864584d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:15 crc kubenswrapper[4792]: E0301 09:25:15.264488 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" podUID="e45ebab9-87d5-4b2f-b3d1-f1832864584d" Mar 01 09:25:15 crc kubenswrapper[4792]: I0301 09:25:15.417259 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" path="/var/lib/kubelet/pods/15fa5cd2-57f2-4589-9947-c4a227fa68b6/volumes" Mar 01 09:25:15 crc kubenswrapper[4792]: E0301 09:25:15.748531 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c" Mar 01 09:25:15 crc kubenswrapper[4792]: E0301 09:25:15.748700 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sxwh7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7c789f89c6-wjf62_openstack-operators(234d2ae5-7589-44cc-83f4-b0ee8a91940a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:15 crc kubenswrapper[4792]: E0301 09:25:15.749884 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" podUID="234d2ae5-7589-44cc-83f4-b0ee8a91940a" Mar 01 09:25:16 crc kubenswrapper[4792]: E0301 09:25:16.036510 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" podUID="234d2ae5-7589-44cc-83f4-b0ee8a91940a" Mar 01 09:25:16 crc kubenswrapper[4792]: E0301 09:25:16.289274 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e" Mar 01 09:25:16 crc kubenswrapper[4792]: E0301 09:25:16.289447 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fct2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-648564c9fc-jdn6k_openstack-operators(808b8753-0a20-419b-8b04-dcbccaa2d77e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:16 crc kubenswrapper[4792]: E0301 09:25:16.290623 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" podUID="808b8753-0a20-419b-8b04-dcbccaa2d77e" Mar 01 09:25:16 crc kubenswrapper[4792]: E0301 09:25:16.725765 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 01 09:25:16 crc kubenswrapper[4792]: E0301 09:25:16.726054 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nsppl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-r5l9m_openstack-operators(1ecd6b07-eda9-41d6-90af-6471699ff808): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:16 crc kubenswrapper[4792]: E0301 09:25:16.728887 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m" podUID="1ecd6b07-eda9-41d6-90af-6471699ff808" Mar 01 09:25:17 crc kubenswrapper[4792]: I0301 09:25:17.045810 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"] Mar 01 09:25:17 crc kubenswrapper[4792]: I0301 09:25:17.048684 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2" 
event={"ID":"dfb10d33-c4f1-4287-be83-dff835c733ba","Type":"ContainerStarted","Data":"7f3a04247b646b10567c07c9e3e71548969040e3e9071182a3e180e0de1ed7d2"} Mar 01 09:25:17 crc kubenswrapper[4792]: W0301 09:25:17.058880 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9244686e_175e_45f9_9eb7_23621cd1f3cd.slice/crio-3f6926cb229a8c54312c01ad1c5c76705cfa50a019a881ae8c04e230a3bdbb3b WatchSource:0}: Error finding container 3f6926cb229a8c54312c01ad1c5c76705cfa50a019a881ae8c04e230a3bdbb3b: Status 404 returned error can't find the container with id 3f6926cb229a8c54312c01ad1c5c76705cfa50a019a881ae8c04e230a3bdbb3b Mar 01 09:25:17 crc kubenswrapper[4792]: I0301 09:25:17.062593 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2" Mar 01 09:25:17 crc kubenswrapper[4792]: E0301 09:25:17.062752 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m" podUID="1ecd6b07-eda9-41d6-90af-6471699ff808" Mar 01 09:25:17 crc kubenswrapper[4792]: I0301 09:25:17.090590 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2" podStartSLOduration=7.142139149 podStartE2EDuration="35.090568357s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.733623252 +0000 UTC m=+1015.975502449" lastFinishedPulling="2026-03-01 09:25:14.68205242 +0000 UTC m=+1043.923931657" observedRunningTime="2026-03-01 09:25:17.084252647 +0000 UTC m=+1046.326131844" watchObservedRunningTime="2026-03-01 
09:25:17.090568357 +0000 UTC m=+1046.332447554" Mar 01 09:25:17 crc kubenswrapper[4792]: I0301 09:25:17.335862 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"] Mar 01 09:25:17 crc kubenswrapper[4792]: W0301 09:25:17.339408 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6739c2_185a_43e7_8fcf_0b2ae31957a0.slice/crio-f96a68fcaf233f289334a7086cece4460f0c737f53e6d3344ee4df9e6fddae97 WatchSource:0}: Error finding container f96a68fcaf233f289334a7086cece4460f0c737f53e6d3344ee4df9e6fddae97: Status 404 returned error can't find the container with id f96a68fcaf233f289334a7086cece4460f0c737f53e6d3344ee4df9e6fddae97 Mar 01 09:25:17 crc kubenswrapper[4792]: I0301 09:25:17.340861 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-76rvg"] Mar 01 09:25:17 crc kubenswrapper[4792]: I0301 09:25:17.369588 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"] Mar 01 09:25:17 crc kubenswrapper[4792]: W0301 09:25:17.387196 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1d3783f_78e9_461a_916a_5a46e3083e70.slice/crio-4015b7338d1f84c58b14a44bada706d5334972f3a577cbe5b4d43f140d3fbc28 WatchSource:0}: Error finding container 4015b7338d1f84c58b14a44bada706d5334972f3a577cbe5b4d43f140d3fbc28: Status 404 returned error can't find the container with id 4015b7338d1f84c58b14a44bada706d5334972f3a577cbe5b4d43f140d3fbc28 Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.055188 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r" 
event={"ID":"5044cf86-f557-41d4-b6c0-a41a668ac999","Type":"ContainerStarted","Data":"ad7ac441d6a6d297f23e4f60f71aaf15be42b6ef9a3f77c6078baffb9583af3d"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.056529 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.058296 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" event={"ID":"ea6739c2-185a-43e7-8fcf-0b2ae31957a0","Type":"ContainerStarted","Data":"f96a68fcaf233f289334a7086cece4460f0c737f53e6d3344ee4df9e6fddae97"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.059113 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" event={"ID":"9244686e-175e-45f9-9eb7-23621cd1f3cd","Type":"ContainerStarted","Data":"3f6926cb229a8c54312c01ad1c5c76705cfa50a019a881ae8c04e230a3bdbb3b"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.064151 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c" event={"ID":"3d38195c-e4ff-49cf-9592-e9f52d73f2df","Type":"ContainerStarted","Data":"d9878c65a4fd17c28fde0a37cca1809fdeaa343383582c866c265cc65d15bcfa"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.064740 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.065768 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" event={"ID":"d1d3783f-78e9-461a-916a-5a46e3083e70","Type":"ContainerStarted","Data":"4787f8f2d253ff4dd1d652823910ffe40b98fc88f9619b345142107028aa83f6"} Mar 01 09:25:18 crc 
kubenswrapper[4792]: I0301 09:25:18.065795 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" event={"ID":"d1d3783f-78e9-461a-916a-5a46e3083e70","Type":"ContainerStarted","Data":"4015b7338d1f84c58b14a44bada706d5334972f3a577cbe5b4d43f140d3fbc28"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.066278 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.067358 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb" event={"ID":"8741a141-0194-4eb2-956e-c41f4ffe1338","Type":"ContainerStarted","Data":"ea5bb485455205d13e998f2d92ddff8adb9c1ab3676579be65add75470498c08"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.067811 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.069058 4792 generic.go:334] "Generic (PLEG): container finished" podID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerID="832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243" exitCode=0 Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.069110 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76rvg" event={"ID":"5ee54ba4-1afe-492b-a35b-23f0da447772","Type":"ContainerDied","Data":"832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.069129 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76rvg" 
event={"ID":"5ee54ba4-1afe-492b-a35b-23f0da447772","Type":"ContainerStarted","Data":"6f39b288ec76c37d95cc09cbb8367cea58bf1050a4c16d38edfc803ed8d4b5b8"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.073428 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j" event={"ID":"2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5","Type":"ContainerStarted","Data":"d22ef362e5f7c3b60d8a91d8c51b4543bfe8b25dee50b04edef1cacaa3f86986"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.073515 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.075374 4792 generic.go:334] "Generic (PLEG): container finished" podID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerID="7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277" exitCode=0 Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.075413 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfjs" event={"ID":"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a","Type":"ContainerDied","Data":"7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.115404 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c" podStartSLOduration=7.508994189 podStartE2EDuration="35.11538325s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:47.054362381 +0000 UTC m=+1016.296241578" lastFinishedPulling="2026-03-01 09:25:14.660751442 +0000 UTC m=+1043.902630639" observedRunningTime="2026-03-01 09:25:18.115093393 +0000 UTC m=+1047.356972590" watchObservedRunningTime="2026-03-01 09:25:18.11538325 +0000 UTC m=+1047.357262457" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 
09:25:18.119345 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r" podStartSLOduration=7.643419735 podStartE2EDuration="36.119331184s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.206177622 +0000 UTC m=+1015.448056819" lastFinishedPulling="2026-03-01 09:25:14.682089071 +0000 UTC m=+1043.923968268" observedRunningTime="2026-03-01 09:25:18.089724138 +0000 UTC m=+1047.331603345" watchObservedRunningTime="2026-03-01 09:25:18.119331184 +0000 UTC m=+1047.361210381" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.138510 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb" podStartSLOduration=6.049465239 podStartE2EDuration="36.13848575s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:44.57169961 +0000 UTC m=+1013.813578817" lastFinishedPulling="2026-03-01 09:25:14.660720101 +0000 UTC m=+1043.902599328" observedRunningTime="2026-03-01 09:25:18.133718667 +0000 UTC m=+1047.375597864" watchObservedRunningTime="2026-03-01 09:25:18.13848575 +0000 UTC m=+1047.380364947" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.209346 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" podStartSLOduration=35.209321838 podStartE2EDuration="35.209321838s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:25:18.199783421 +0000 UTC m=+1047.441662618" watchObservedRunningTime="2026-03-01 09:25:18.209321838 +0000 UTC m=+1047.451201045" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.236201 4792 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j" podStartSLOduration=6.573772833 podStartE2EDuration="36.236184638s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.61697233 +0000 UTC m=+1015.858851527" lastFinishedPulling="2026-03-01 09:25:16.279384135 +0000 UTC m=+1045.521263332" observedRunningTime="2026-03-01 09:25:18.22868142 +0000 UTC m=+1047.470560617" watchObservedRunningTime="2026-03-01 09:25:18.236184638 +0000 UTC m=+1047.478063835"
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.119978 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" event={"ID":"4fe8270e-a46d-40bc-8d24-a4585b196f5e","Type":"ContainerStarted","Data":"6892640fbb705e28b0c71c2706ad76c2d091ae6f2dd03c538b97f78bac1cc741"}
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.125521 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz"
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.134252 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" event={"ID":"e0cef8e2-a392-4612-97c6-17c611b2a44e","Type":"ContainerStarted","Data":"3b2391a3b24c4743bb3f6633537111671e24e6bbc2f99b760acdcabb3d972ce9"}
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.137234 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" event={"ID":"2970c60c-7b03-4667-99e4-08c094cdbfc2","Type":"ContainerStarted","Data":"aa90c8aa701a9cf03433d8e5e5defb7d1475d444068902178946a74477ab2a0f"}
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.137578 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns"
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.143651 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfjs" event={"ID":"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a","Type":"ContainerStarted","Data":"888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10"}
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.146089 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" event={"ID":"ecc17c18-7695-4d22-9a95-bcac51800d60","Type":"ContainerStarted","Data":"0f09ae0e09ec4ee1981076cf0062bdb867f23798fa0a4b442b4fe351e4f18779"}
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.146360 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl"
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.149470 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76rvg" event={"ID":"5ee54ba4-1afe-492b-a35b-23f0da447772","Type":"ContainerStarted","Data":"d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a"}
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.167656 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" podStartSLOduration=4.970947146 podStartE2EDuration="39.167634479s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:47.168784099 +0000 UTC m=+1016.410663296" lastFinishedPulling="2026-03-01 09:25:21.365471432 +0000 UTC m=+1050.607350629" observedRunningTime="2026-03-01 09:25:22.145530892 +0000 UTC m=+1051.387410099" watchObservedRunningTime="2026-03-01 09:25:22.167634479 +0000 UTC m=+1051.409513676"
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.180969 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" event={"ID":"ea6739c2-185a-43e7-8fcf-0b2ae31957a0","Type":"ContainerStarted","Data":"f9736d0776a1969518aa8cf69ea66d57fc40ece1f9479a0b2732c5251133a5e1"}
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.181593 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.203273 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vgfjs" podStartSLOduration=6.772570505 podStartE2EDuration="34.203253907s" podCreationTimestamp="2026-03-01 09:24:48 +0000 UTC" firstStartedPulling="2026-03-01 09:24:53.237570305 +0000 UTC m=+1022.479449512" lastFinishedPulling="2026-03-01 09:25:20.668253717 +0000 UTC m=+1049.910132914" observedRunningTime="2026-03-01 09:25:22.182460041 +0000 UTC m=+1051.424339238" watchObservedRunningTime="2026-03-01 09:25:22.203253907 +0000 UTC m=+1051.445133104"
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.208270 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" event={"ID":"9244686e-175e-45f9-9eb7-23621cd1f3cd","Type":"ContainerStarted","Data":"f62adb87846062dc897503d4685420a6831398730f723a3a8bffe44bda9273dc"}
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.209105 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.210672 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" podStartSLOduration=6.591924972 podStartE2EDuration="39.210660354s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:47.121919149 +0000 UTC m=+1016.363798346" lastFinishedPulling="2026-03-01 09:25:19.740654511 +0000 UTC m=+1048.982533728" observedRunningTime="2026-03-01 09:25:22.208967573 +0000 UTC m=+1051.450846770" watchObservedRunningTime="2026-03-01 09:25:22.210660354 +0000 UTC m=+1051.452539551"
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.244863 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" podStartSLOduration=34.954972636 podStartE2EDuration="39.244842418s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:25:17.0768495 +0000 UTC m=+1046.318728697" lastFinishedPulling="2026-03-01 09:25:21.366719282 +0000 UTC m=+1050.608598479" observedRunningTime="2026-03-01 09:25:22.238083357 +0000 UTC m=+1051.479962574" watchObservedRunningTime="2026-03-01 09:25:22.244842418 +0000 UTC m=+1051.486721615"
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.289866 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" podStartSLOduration=36.267530955 podStartE2EDuration="40.289848611s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:25:17.342090811 +0000 UTC m=+1046.583969998" lastFinishedPulling="2026-03-01 09:25:21.364408457 +0000 UTC m=+1050.606287654" observedRunningTime="2026-03-01 09:25:22.28479562 +0000 UTC m=+1051.526674817" watchObservedRunningTime="2026-03-01 09:25:22.289848611 +0000 UTC m=+1051.531727808"
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.311547 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" podStartSLOduration=6.51000262 podStartE2EDuration="39.311528747s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.940123778 +0000 UTC m=+1016.182002975" lastFinishedPulling="2026-03-01 09:25:19.741649895 +0000 UTC m=+1048.983529102" observedRunningTime="2026-03-01 09:25:22.306897067 +0000 UTC m=+1051.548776264" watchObservedRunningTime="2026-03-01 09:25:22.311528747 +0000 UTC m=+1051.553407944"
Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.898247 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"
Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.073095 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r"
Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.218202 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" event={"ID":"1793465e-1273-4250-a238-c99798788618","Type":"ContainerStarted","Data":"1babc5a330340ac6903322191b57ac6f28537095bd94a297360a9b3ffade03eb"}
Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.218516 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j"
Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.219153 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6"
Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.220976 4792 generic.go:334] "Generic (PLEG): container finished" podID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerID="d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a" exitCode=0
Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.221346 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-marketplace/community-operators-76rvg" event={"ID":"5ee54ba4-1afe-492b-a35b-23f0da447772","Type":"ContainerDied","Data":"d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a"}
Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.222783 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr"
Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.238469 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" podStartSLOduration=6.208555556 podStartE2EDuration="41.238448387s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.826102031 +0000 UTC m=+1016.067981228" lastFinishedPulling="2026-03-01 09:25:21.855994862 +0000 UTC m=+1051.097874059" observedRunningTime="2026-03-01 09:25:23.237226028 +0000 UTC m=+1052.479105225" watchObservedRunningTime="2026-03-01 09:25:23.238448387 +0000 UTC m=+1052.480327584"
Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.273638 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" podStartSLOduration=5.542889423 podStartE2EDuration="40.273615835s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:47.124212535 +0000 UTC m=+1016.366091732" lastFinishedPulling="2026-03-01 09:25:21.854938947 +0000 UTC m=+1051.096818144" observedRunningTime="2026-03-01 09:25:23.267606502 +0000 UTC m=+1052.509485699" watchObservedRunningTime="2026-03-01 09:25:23.273615835 +0000 UTC m=+1052.515495032"
Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.362382 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2"
Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.777352 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c"
Mar 01 09:25:27 crc kubenswrapper[4792]: E0301 09:25:27.557023 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" podUID="808b8753-0a20-419b-8b04-dcbccaa2d77e"
Mar 01 09:25:27 crc kubenswrapper[4792]: E0301 09:25:27.571100 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" podUID="e45ebab9-87d5-4b2f-b3d1-f1832864584d"
Mar 01 09:25:28 crc kubenswrapper[4792]: I0301 09:25:28.253779 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76rvg" event={"ID":"5ee54ba4-1afe-492b-a35b-23f0da447772","Type":"ContainerStarted","Data":"0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453"}
Mar 01 09:25:28 crc kubenswrapper[4792]: I0301 09:25:28.255617 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh" event={"ID":"02dd5cc0-c44b-4ede-972b-9d26c9c54100","Type":"ContainerStarted","Data":"1f0fdd6f611b2cf63cb513a2021cdbfb6a0e69f83021884f9e2a5716c7dfdb7d"}
Mar 01 09:25:28 crc kubenswrapper[4792]: I0301 09:25:28.255976 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh"
Mar 01 09:25:28 crc kubenswrapper[4792]: I0301 09:25:28.271959 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-76rvg" podStartSLOduration=27.769899138 podStartE2EDuration="37.271940911s" podCreationTimestamp="2026-03-01 09:24:51 +0000 UTC" firstStartedPulling="2026-03-01 09:25:18.070166792 +0000 UTC m=+1047.312045989" lastFinishedPulling="2026-03-01 09:25:27.572208565 +0000 UTC m=+1056.814087762" observedRunningTime="2026-03-01 09:25:28.270319342 +0000 UTC m=+1057.512198529" watchObservedRunningTime="2026-03-01 09:25:28.271940911 +0000 UTC m=+1057.513820108"
Mar 01 09:25:28 crc kubenswrapper[4792]: I0301 09:25:28.307269 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh" podStartSLOduration=5.340439285 podStartE2EDuration="46.307240502s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.651014585 +0000 UTC m=+1015.892893782" lastFinishedPulling="2026-03-01 09:25:27.617815802 +0000 UTC m=+1056.859694999" observedRunningTime="2026-03-01 09:25:28.299683392 +0000 UTC m=+1057.541562609" watchObservedRunningTime="2026-03-01 09:25:28.307240502 +0000 UTC m=+1057.549119699"
Mar 01 09:25:28 crc kubenswrapper[4792]: I0301 09:25:28.774554 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.206164 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vgfjs"
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.206233 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vgfjs"
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.277611 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" event={"ID":"bf1f37ea-a566-4dfd-b45b-02f284f19ce3","Type":"ContainerStarted","Data":"560bd85111d581b27b39be06d4086bec168b864f989e9133615e7a97f6221eb9"}
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.278878 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw"
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.282857 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg" event={"ID":"b9e3fd6b-e3e2-4380-b8d7-900891df562a","Type":"ContainerStarted","Data":"45614005e3c7ecd0bab6ebc1c2f744a57f9d4ee3751d0a1f01582d0ad0b5f387"}
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.283158 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.287419 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m" event={"ID":"1ecd6b07-eda9-41d6-90af-6471699ff808","Type":"ContainerStarted","Data":"80541088e8c24ff4ba3c18c56556e60eda017df94f18130d34db13224329f773"}
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.295213 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" event={"ID":"8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9","Type":"ContainerStarted","Data":"8737006219f5372a30ad61237d82618383e87d9283f6ad5f49b2991c2224b97f"}
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.295741 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m"
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.297553 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" event={"ID":"234d2ae5-7589-44cc-83f4-b0ee8a91940a","Type":"ContainerStarted","Data":"172e116d6c1904833ece964abe39b26379c590a14618bb39e7bb55e8277d5f6a"}
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.297939 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62"
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.299484 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" event={"ID":"376afe52-646d-44b7-b32e-ce6cd6dc21a6","Type":"ContainerStarted","Data":"5e3956812ed98050d4a71d8b39110c43a71dfa78c7f35697b9c16752b4a6a549"}
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.299810 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn"
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.343810 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m" podStartSLOduration=4.600301953 podStartE2EDuration="46.343794263s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:47.242869266 +0000 UTC m=+1016.484748463" lastFinishedPulling="2026-03-01 09:25:28.986361576 +0000 UTC m=+1058.228240773" observedRunningTime="2026-03-01 09:25:29.338535838 +0000 UTC m=+1058.580415035" watchObservedRunningTime="2026-03-01 09:25:29.343794263 +0000 UTC m=+1058.585673460"
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.344212 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" podStartSLOduration=3.360604743 podStartE2EDuration="47.344206783s"
podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:44.02145788 +0000 UTC m=+1013.263337077" lastFinishedPulling="2026-03-01 09:25:28.0050599 +0000 UTC m=+1057.246939117" observedRunningTime="2026-03-01 09:25:29.32136155 +0000 UTC m=+1058.563240747" watchObservedRunningTime="2026-03-01 09:25:29.344206783 +0000 UTC m=+1058.586085980"
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.372922 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" podStartSLOduration=6.206345242 podStartE2EDuration="47.372889477s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.795863279 +0000 UTC m=+1016.037742476" lastFinishedPulling="2026-03-01 09:25:27.962407494 +0000 UTC m=+1057.204286711" observedRunningTime="2026-03-01 09:25:29.365752197 +0000 UTC m=+1058.607631394" watchObservedRunningTime="2026-03-01 09:25:29.372889477 +0000 UTC m=+1058.614768674"
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.394105 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" podStartSLOduration=5.220785248 podStartE2EDuration="47.394088622s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.686556627 +0000 UTC m=+1015.928435824" lastFinishedPulling="2026-03-01 09:25:28.859860001 +0000 UTC m=+1058.101739198" observedRunningTime="2026-03-01 09:25:29.393630641 +0000 UTC m=+1058.635509838" watchObservedRunningTime="2026-03-01 09:25:29.394088622 +0000 UTC m=+1058.635967819"
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.424201 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" podStartSLOduration=5.111613239 podStartE2EDuration="47.424181629s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.676266255 +0000 UTC m=+1015.918145452" lastFinishedPulling="2026-03-01 09:25:28.988834645 +0000 UTC m=+1058.230713842" observedRunningTime="2026-03-01 09:25:29.416773612 +0000 UTC m=+1058.658652809" watchObservedRunningTime="2026-03-01 09:25:29.424181629 +0000 UTC m=+1058.666060826"
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.435116 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg" podStartSLOduration=4.058551011 podStartE2EDuration="47.435098089s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:44.584426002 +0000 UTC m=+1013.826305199" lastFinishedPulling="2026-03-01 09:25:27.96097308 +0000 UTC m=+1057.202852277" observedRunningTime="2026-03-01 09:25:29.429581558 +0000 UTC m=+1058.671460755" watchObservedRunningTime="2026-03-01 09:25:29.435098089 +0000 UTC m=+1058.676977286"
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.528345 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"
Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.965576 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"
Mar 01 09:25:30 crc kubenswrapper[4792]: I0301 09:25:30.314392 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" event={"ID":"cd83ed19-023d-43c2-92db-d290499db3d4","Type":"ContainerStarted","Data":"23bd907d7303eab6c38e1e32812fcbbfb88d20a5a6e1bf2c2d7e1863cd0a2ccb"}
Mar 01 09:25:30 crc kubenswrapper[4792]: I0301 09:25:30.339446 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" podStartSLOduration=4.776126929 podStartE2EDuration="48.3394304s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.241871048 +0000 UTC m=+1015.483750245" lastFinishedPulling="2026-03-01 09:25:29.805174519 +0000 UTC m=+1059.047053716" observedRunningTime="2026-03-01 09:25:30.334821941 +0000 UTC m=+1059.576701138" watchObservedRunningTime="2026-03-01 09:25:30.3394304 +0000 UTC m=+1059.581309597"
Mar 01 09:25:30 crc kubenswrapper[4792]: I0301 09:25:30.413399 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-vgfjs" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="registry-server" probeResult="failure" output=<
Mar 01 09:25:30 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 01 09:25:30 crc kubenswrapper[4792]: >
Mar 01 09:25:31 crc kubenswrapper[4792]: I0301 09:25:31.872578 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-76rvg"
Mar 01 09:25:31 crc kubenswrapper[4792]: I0301 09:25:31.874244 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-76rvg"
Mar 01 09:25:32 crc kubenswrapper[4792]: I0301 09:25:32.914351 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-76rvg" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="registry-server" probeResult="failure" output=<
Mar 01 09:25:32 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 01 09:25:32 crc kubenswrapper[4792]: >
Mar 01 09:25:32 crc kubenswrapper[4792]: I0301 09:25:32.969705 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh"
Mar 01 09:25:33 crc kubenswrapper[4792]: I0301 09:25:33.139536 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx"
Mar 01 09:25:33 crc kubenswrapper[4792]: I0301 09:25:33.217776 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62"
Mar 01 09:25:33 crc kubenswrapper[4792]: I0301 09:25:33.312668 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6"
Mar 01 09:25:33 crc kubenswrapper[4792]: I0301 09:25:33.653088 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl"
Mar 01 09:25:33 crc kubenswrapper[4792]: I0301 09:25:33.953392 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz"
Mar 01 09:25:33 crc kubenswrapper[4792]: I0301 09:25:33.970975 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr"
Mar 01 09:25:34 crc kubenswrapper[4792]: I0301 09:25:34.229441 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns"
Mar 01 09:25:39 crc kubenswrapper[4792]: I0301 09:25:39.259600 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vgfjs"
Mar 01 09:25:39 crc kubenswrapper[4792]: I0301 09:25:39.304092 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vgfjs"
Mar 01 09:25:39 crc kubenswrapper[4792]: I0301 09:25:39.498605 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vgfjs"]
Mar 01 09:25:40 crc kubenswrapper[4792]: I0301 09:25:40.372537 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vgfjs" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="registry-server" containerID="cri-o://888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10" gracePeriod=2
Mar 01 09:25:40 crc kubenswrapper[4792]: I0301 09:25:40.874364 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgfjs"
Mar 01 09:25:40 crc kubenswrapper[4792]: I0301 09:25:40.984184 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xztgl\" (UniqueName: \"kubernetes.io/projected/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-kube-api-access-xztgl\") pod \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") "
Mar 01 09:25:40 crc kubenswrapper[4792]: I0301 09:25:40.984261 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-catalog-content\") pod \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") "
Mar 01 09:25:40 crc kubenswrapper[4792]: I0301 09:25:40.984283 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-utilities\") pod \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") "
Mar 01 09:25:40 crc kubenswrapper[4792]: I0301 09:25:40.985146 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-utilities" (OuterVolumeSpecName: "utilities") pod "02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" (UID:
"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:25:40 crc kubenswrapper[4792]: I0301 09:25:40.995241 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-kube-api-access-xztgl" (OuterVolumeSpecName: "kube-api-access-xztgl") pod "02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" (UID: "02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a"). InnerVolumeSpecName "kube-api-access-xztgl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.042695 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" (UID: "02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.085183 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xztgl\" (UniqueName: \"kubernetes.io/projected/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-kube-api-access-xztgl\") on node \"crc\" DevicePath \"\""
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.085370 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.085457 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-utilities\") on node \"crc\" DevicePath \"\""
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.383239 4792 generic.go:334] "Generic (PLEG): container finished" podID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerID="888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10" exitCode=0
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.383279 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfjs" event={"ID":"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a","Type":"ContainerDied","Data":"888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10"}
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.383314 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfjs" event={"ID":"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a","Type":"ContainerDied","Data":"07b2ccb12f444a169347668634bc26575de0ebccb8ee9dc035b529cef91259bc"}
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.383341 4792 scope.go:117] "RemoveContainer" containerID="888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10"
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.384709 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgfjs"
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.425628 4792 scope.go:117] "RemoveContainer" containerID="7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277"
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.426885 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vgfjs"]
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.439560 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vgfjs"]
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.454492 4792 scope.go:117] "RemoveContainer" containerID="eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526"
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.493432 4792 scope.go:117] "RemoveContainer" containerID="888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10"
Mar 01 09:25:41 crc kubenswrapper[4792]: E0301 09:25:41.493822 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10\": container with ID starting with 888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10 not found: ID does not exist" containerID="888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10"
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.493855 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10"} err="failed to get container status \"888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10\": rpc error: code = NotFound desc = could not find container \"888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10\": container with ID starting with 888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10 not found: ID does not exist"
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.493880 4792 scope.go:117] "RemoveContainer" containerID="7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277"
Mar 01 09:25:41 crc kubenswrapper[4792]: E0301 09:25:41.494181 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277\": container with ID starting with 7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277 not found: ID does not exist" containerID="7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277"
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.494208 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277"} err="failed to get container status \"7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277\": rpc error: code = NotFound desc = could not find container \"7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277\": container with ID starting with 7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277 not found: ID does not exist"
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.494225 4792 scope.go:117] "RemoveContainer" containerID="eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526"
Mar 01 09:25:41 crc kubenswrapper[4792]: E0301 09:25:41.494814 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526\": container with ID starting with eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526 not found: ID does not exist" containerID="eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526"
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.494837 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526"} err="failed to get container status \"eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526\": rpc error: code = NotFound desc = could not find container \"eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526\": container with ID starting with eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526 not found: ID does not exist"
Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.957540 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-76rvg"
Mar 01 09:25:42 crc kubenswrapper[4792]: I0301 09:25:42.023511 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-76rvg"
Mar 01 09:25:42 crc kubenswrapper[4792]: I0301 09:25:42.390112 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" event={"ID":"808b8753-0a20-419b-8b04-dcbccaa2d77e","Type":"ContainerStarted","Data":"3256fa7dea275757daf6c0806ce9bfa49ca3f09dc326c3af4e322bc3a564d5fb"}
Mar 01 09:25:42 crc kubenswrapper[4792]: I0301 09:25:42.391074 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k"
Mar 01 09:25:42 crc kubenswrapper[4792]: I0301 09:25:42.410574 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" podStartSLOduration=4.784279243 podStartE2EDuration="59.410557608s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:47.27642797 +0000 UTC m=+1016.518307167" lastFinishedPulling="2026-03-01 09:25:41.902706335 +0000 UTC m=+1071.144585532" observedRunningTime="2026-03-01
09:25:42.405969639 +0000 UTC m=+1071.647848836" watchObservedRunningTime="2026-03-01 09:25:42.410557608 +0000 UTC m=+1071.652436815" Mar 01 09:25:42 crc kubenswrapper[4792]: I0301 09:25:42.880846 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg" Mar 01 09:25:42 crc kubenswrapper[4792]: I0301 09:25:42.952638 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.142657 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.346456 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.399398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" event={"ID":"e45ebab9-87d5-4b2f-b3d1-f1832864584d","Type":"ContainerStarted","Data":"fc2c4132f6b6f61506c1aef72d0aaac00e147da7528cfb7453b927b244ba87d0"} Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.399932 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.420031 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" path="/var/lib/kubelet/pods/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a/volumes" Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.421356 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" podStartSLOduration=4.883327199 podStartE2EDuration="1m0.421339606s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:47.257359652 +0000 UTC m=+1016.499238849" lastFinishedPulling="2026-03-01 09:25:42.795372059 +0000 UTC m=+1072.037251256" observedRunningTime="2026-03-01 09:25:43.418465898 +0000 UTC m=+1072.660345095" watchObservedRunningTime="2026-03-01 09:25:43.421339606 +0000 UTC m=+1072.663218803" Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.558587 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.896888 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-76rvg"] Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.897228 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-76rvg" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="registry-server" containerID="cri-o://0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453" gracePeriod=2 Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.341534 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.407114 4792 generic.go:334] "Generic (PLEG): container finished" podID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerID="0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453" exitCode=0 Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.407767 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.408185 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76rvg" event={"ID":"5ee54ba4-1afe-492b-a35b-23f0da447772","Type":"ContainerDied","Data":"0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453"} Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.408214 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76rvg" event={"ID":"5ee54ba4-1afe-492b-a35b-23f0da447772","Type":"ContainerDied","Data":"6f39b288ec76c37d95cc09cbb8367cea58bf1050a4c16d38edfc803ed8d4b5b8"} Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.408231 4792 scope.go:117] "RemoveContainer" containerID="0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.426209 4792 scope.go:117] "RemoveContainer" containerID="d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.452160 4792 scope.go:117] "RemoveContainer" containerID="832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.452831 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-utilities\") pod \"5ee54ba4-1afe-492b-a35b-23f0da447772\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.452956 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-catalog-content\") pod \"5ee54ba4-1afe-492b-a35b-23f0da447772\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " Mar 01 09:25:44 crc kubenswrapper[4792]: 
I0301 09:25:44.452991 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c9l7\" (UniqueName: \"kubernetes.io/projected/5ee54ba4-1afe-492b-a35b-23f0da447772-kube-api-access-4c9l7\") pod \"5ee54ba4-1afe-492b-a35b-23f0da447772\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.462041 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee54ba4-1afe-492b-a35b-23f0da447772-kube-api-access-4c9l7" (OuterVolumeSpecName: "kube-api-access-4c9l7") pod "5ee54ba4-1afe-492b-a35b-23f0da447772" (UID: "5ee54ba4-1afe-492b-a35b-23f0da447772"). InnerVolumeSpecName "kube-api-access-4c9l7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.463001 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-utilities" (OuterVolumeSpecName: "utilities") pod "5ee54ba4-1afe-492b-a35b-23f0da447772" (UID: "5ee54ba4-1afe-492b-a35b-23f0da447772"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.476219 4792 scope.go:117] "RemoveContainer" containerID="0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453" Mar 01 09:25:44 crc kubenswrapper[4792]: E0301 09:25:44.477244 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453\": container with ID starting with 0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453 not found: ID does not exist" containerID="0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.477273 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453"} err="failed to get container status \"0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453\": rpc error: code = NotFound desc = could not find container \"0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453\": container with ID starting with 0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453 not found: ID does not exist" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.477307 4792 scope.go:117] "RemoveContainer" containerID="d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a" Mar 01 09:25:44 crc kubenswrapper[4792]: E0301 09:25:44.477978 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a\": container with ID starting with d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a not found: ID does not exist" containerID="d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.478028 
4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a"} err="failed to get container status \"d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a\": rpc error: code = NotFound desc = could not find container \"d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a\": container with ID starting with d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a not found: ID does not exist" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.478055 4792 scope.go:117] "RemoveContainer" containerID="832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243" Mar 01 09:25:44 crc kubenswrapper[4792]: E0301 09:25:44.478555 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243\": container with ID starting with 832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243 not found: ID does not exist" containerID="832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.478583 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243"} err="failed to get container status \"832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243\": rpc error: code = NotFound desc = could not find container \"832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243\": container with ID starting with 832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243 not found: ID does not exist" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.504670 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "5ee54ba4-1afe-492b-a35b-23f0da447772" (UID: "5ee54ba4-1afe-492b-a35b-23f0da447772"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.554954 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.555006 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.555021 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c9l7\" (UniqueName: \"kubernetes.io/projected/5ee54ba4-1afe-492b-a35b-23f0da447772-kube-api-access-4c9l7\") on node \"crc\" DevicePath \"\"" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.735186 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-76rvg"] Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.742731 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-76rvg"] Mar 01 09:25:45 crc kubenswrapper[4792]: I0301 09:25:45.419702 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" path="/var/lib/kubelet/pods/5ee54ba4-1afe-492b-a35b-23f0da447772/volumes" Mar 01 09:25:53 crc kubenswrapper[4792]: I0301 09:25:53.984760 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" Mar 01 09:25:53 crc kubenswrapper[4792]: I0301 09:25:53.996251 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.142862 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539286-l47hq"] Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148699 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="extract-content" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.148744 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="extract-content" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148763 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="extract-content" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.148773 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="extract-content" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148784 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="extract-utilities" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.148793 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="extract-utilities" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148830 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="extract-utilities" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.148839 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="extract-utilities" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148852 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="extract-utilities" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.148861 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="extract-utilities" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148875 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.148919 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148933 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.148941 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148956 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.149021 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.149039 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="extract-content" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.149047 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="extract-content" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.149297 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.149337 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.149353 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.150091 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539286-l47hq"] Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.150272 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539286-l47hq" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.153134 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.153511 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.153654 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.257562 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8rtl\" (UniqueName: \"kubernetes.io/projected/2eeb77af-03ae-4e32-80a6-3c16ed5ef64e-kube-api-access-g8rtl\") pod \"auto-csr-approver-29539286-l47hq\" (UID: \"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e\") " pod="openshift-infra/auto-csr-approver-29539286-l47hq" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.359156 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8rtl\" 
(UniqueName: \"kubernetes.io/projected/2eeb77af-03ae-4e32-80a6-3c16ed5ef64e-kube-api-access-g8rtl\") pod \"auto-csr-approver-29539286-l47hq\" (UID: \"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e\") " pod="openshift-infra/auto-csr-approver-29539286-l47hq" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.382612 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8rtl\" (UniqueName: \"kubernetes.io/projected/2eeb77af-03ae-4e32-80a6-3c16ed5ef64e-kube-api-access-g8rtl\") pod \"auto-csr-approver-29539286-l47hq\" (UID: \"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e\") " pod="openshift-infra/auto-csr-approver-29539286-l47hq" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.467592 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539286-l47hq" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.924399 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539286-l47hq"] Mar 01 09:26:00 crc kubenswrapper[4792]: W0301 09:26:00.934741 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eeb77af_03ae_4e32_80a6_3c16ed5ef64e.slice/crio-c19841951e9472daf35bdf79a2795bc24df43866e565d8780899803e8cd6f4ad WatchSource:0}: Error finding container c19841951e9472daf35bdf79a2795bc24df43866e565d8780899803e8cd6f4ad: Status 404 returned error can't find the container with id c19841951e9472daf35bdf79a2795bc24df43866e565d8780899803e8cd6f4ad Mar 01 09:26:01 crc kubenswrapper[4792]: I0301 09:26:01.520068 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539286-l47hq" event={"ID":"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e","Type":"ContainerStarted","Data":"c19841951e9472daf35bdf79a2795bc24df43866e565d8780899803e8cd6f4ad"} Mar 01 09:26:02 crc kubenswrapper[4792]: I0301 09:26:02.528165 4792 generic.go:334] "Generic (PLEG): 
container finished" podID="2eeb77af-03ae-4e32-80a6-3c16ed5ef64e" containerID="30d57fe1f686a0e7d648422ad7801f657bc274b2e9502cf906d12a5e85e207f4" exitCode=0 Mar 01 09:26:02 crc kubenswrapper[4792]: I0301 09:26:02.528222 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539286-l47hq" event={"ID":"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e","Type":"ContainerDied","Data":"30d57fe1f686a0e7d648422ad7801f657bc274b2e9502cf906d12a5e85e207f4"} Mar 01 09:26:03 crc kubenswrapper[4792]: I0301 09:26:03.786683 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539286-l47hq" Mar 01 09:26:03 crc kubenswrapper[4792]: I0301 09:26:03.912435 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8rtl\" (UniqueName: \"kubernetes.io/projected/2eeb77af-03ae-4e32-80a6-3c16ed5ef64e-kube-api-access-g8rtl\") pod \"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e\" (UID: \"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e\") " Mar 01 09:26:03 crc kubenswrapper[4792]: I0301 09:26:03.917523 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eeb77af-03ae-4e32-80a6-3c16ed5ef64e-kube-api-access-g8rtl" (OuterVolumeSpecName: "kube-api-access-g8rtl") pod "2eeb77af-03ae-4e32-80a6-3c16ed5ef64e" (UID: "2eeb77af-03ae-4e32-80a6-3c16ed5ef64e"). InnerVolumeSpecName "kube-api-access-g8rtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.014305 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8rtl\" (UniqueName: \"kubernetes.io/projected/2eeb77af-03ae-4e32-80a6-3c16ed5ef64e-kube-api-access-g8rtl\") on node \"crc\" DevicePath \"\"" Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.544362 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539286-l47hq" event={"ID":"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e","Type":"ContainerDied","Data":"c19841951e9472daf35bdf79a2795bc24df43866e565d8780899803e8cd6f4ad"} Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.544410 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c19841951e9472daf35bdf79a2795bc24df43866e565d8780899803e8cd6f4ad" Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.544809 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539286-l47hq" Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.867762 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539280-gz7v9"] Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.874122 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539280-gz7v9"] Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.943385 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.943455 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:26:05 crc kubenswrapper[4792]: I0301 09:26:05.416932 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c922d5-9de8-48d8-9f96-ad47d1d4017e" path="/var/lib/kubelet/pods/71c922d5-9de8-48d8-9f96-ad47d1d4017e/volumes" Mar 01 09:26:16 crc kubenswrapper[4792]: I0301 09:26:16.753688 4792 scope.go:117] "RemoveContainer" containerID="752079600b535956d369c891a21eba391b40ef46c0f767fc3b0fcdc6ceb1bddc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.558488 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-sqrnc"] Mar 01 09:26:18 crc kubenswrapper[4792]: E0301 09:26:18.558990 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eeb77af-03ae-4e32-80a6-3c16ed5ef64e" containerName="oc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.559002 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eeb77af-03ae-4e32-80a6-3c16ed5ef64e" containerName="oc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.559137 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eeb77af-03ae-4e32-80a6-3c16ed5ef64e" containerName="oc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.559791 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.562385 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.562593 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.562807 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-glwmn" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.566245 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.569876 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-sqrnc"] Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.641412 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-brcdx"] Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.642552 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-brcdx"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.652924 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.662041 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-brcdx"]
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.705764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e11436-ae27-48c0-8baf-3f0aecc8e73c-config\") pod \"dnsmasq-dns-589db6c89c-sqrnc\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " pod="openstack/dnsmasq-dns-589db6c89c-sqrnc"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.706055 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.706195 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cczf7\" (UniqueName: \"kubernetes.io/projected/40e11436-ae27-48c0-8baf-3f0aecc8e73c-kube-api-access-cczf7\") pod \"dnsmasq-dns-589db6c89c-sqrnc\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " pod="openstack/dnsmasq-dns-589db6c89c-sqrnc"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.706340 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-config\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.706450 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q27bj\" (UniqueName: \"kubernetes.io/projected/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-kube-api-access-q27bj\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.808124 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-config\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.808458 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q27bj\" (UniqueName: \"kubernetes.io/projected/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-kube-api-access-q27bj\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.808708 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e11436-ae27-48c0-8baf-3f0aecc8e73c-config\") pod \"dnsmasq-dns-589db6c89c-sqrnc\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " pod="openstack/dnsmasq-dns-589db6c89c-sqrnc"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.808827 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.809046 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cczf7\" (UniqueName: \"kubernetes.io/projected/40e11436-ae27-48c0-8baf-3f0aecc8e73c-kube-api-access-cczf7\") pod \"dnsmasq-dns-589db6c89c-sqrnc\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " pod="openstack/dnsmasq-dns-589db6c89c-sqrnc"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.809483 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-config\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.810362 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e11436-ae27-48c0-8baf-3f0aecc8e73c-config\") pod \"dnsmasq-dns-589db6c89c-sqrnc\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " pod="openstack/dnsmasq-dns-589db6c89c-sqrnc"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.811005 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.828119 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cczf7\" (UniqueName: \"kubernetes.io/projected/40e11436-ae27-48c0-8baf-3f0aecc8e73c-kube-api-access-cczf7\") pod \"dnsmasq-dns-589db6c89c-sqrnc\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " pod="openstack/dnsmasq-dns-589db6c89c-sqrnc"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.830767 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q27bj\" (UniqueName: \"kubernetes.io/projected/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-kube-api-access-q27bj\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.928165 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-sqrnc"
Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.961685 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-brcdx"
Mar 01 09:26:19 crc kubenswrapper[4792]: I0301 09:26:19.386646 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-sqrnc"]
Mar 01 09:26:19 crc kubenswrapper[4792]: I0301 09:26:19.444545 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-brcdx"]
Mar 01 09:26:19 crc kubenswrapper[4792]: W0301 09:26:19.450261 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf78c5d7_c647_4862_bc7b_e14f8de9ef0f.slice/crio-8e90037762d911668e790efa9d44feb3d2539a5a373a55b8095279629113a42e WatchSource:0}: Error finding container 8e90037762d911668e790efa9d44feb3d2539a5a373a55b8095279629113a42e: Status 404 returned error can't find the container with id 8e90037762d911668e790efa9d44feb3d2539a5a373a55b8095279629113a42e
Mar 01 09:26:19 crc kubenswrapper[4792]: I0301 09:26:19.632372 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" event={"ID":"40e11436-ae27-48c0-8baf-3f0aecc8e73c","Type":"ContainerStarted","Data":"89744d9bb053371289c3f391707b9cba897bfe50eed733d33f5f66e5f68036ab"}
Mar 01 09:26:19 crc kubenswrapper[4792]: I0301 09:26:19.633538 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" event={"ID":"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f","Type":"ContainerStarted","Data":"8e90037762d911668e790efa9d44feb3d2539a5a373a55b8095279629113a42e"}
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.354354 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-sqrnc"]
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.388770 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-czwcl"]
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.390001 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.404212 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-czwcl"]
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.442824 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.443135 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-config\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.443248 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws769\" (UniqueName: \"kubernetes.io/projected/f8a60915-b57b-4331-ad0e-b671ff576a69-kube-api-access-ws769\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.546087 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.546149 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws769\" (UniqueName: \"kubernetes.io/projected/f8a60915-b57b-4331-ad0e-b671ff576a69-kube-api-access-ws769\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.546180 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-config\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.548396 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.550428 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-config\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.589094 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws769\" (UniqueName: \"kubernetes.io/projected/f8a60915-b57b-4331-ad0e-b671ff576a69-kube-api-access-ws769\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.711559 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-brcdx"]
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.715065 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.743593 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"]
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.745093 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.770799 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"]
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.859146 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fwv9\" (UniqueName: \"kubernetes.io/projected/1ebfee14-6440-476a-87ff-fc933df3eaa8-kube-api-access-6fwv9\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.859191 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.859275 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-config\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.960042 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.960287 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-config\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.960365 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fwv9\" (UniqueName: \"kubernetes.io/projected/1ebfee14-6440-476a-87ff-fc933df3eaa8-kube-api-access-6fwv9\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.961567 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.965090 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-config\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.987187 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fwv9\" (UniqueName: \"kubernetes.io/projected/1ebfee14-6440-476a-87ff-fc933df3eaa8-kube-api-access-6fwv9\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.089344 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.140277 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-czwcl"]
Mar 01 09:26:22 crc kubenswrapper[4792]: W0301 09:26:22.150267 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8a60915_b57b_4331_ad0e_b671ff576a69.slice/crio-67be12a989d9394e207bd8944b8cd6294460a40d572fd7dffd8e27ec9d363809 WatchSource:0}: Error finding container 67be12a989d9394e207bd8944b8cd6294460a40d572fd7dffd8e27ec9d363809: Status 404 returned error can't find the container with id 67be12a989d9394e207bd8944b8cd6294460a40d572fd7dffd8e27ec9d363809
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.542368 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.543522 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.548766 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.548950 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.549064 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.549158 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.549312 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.549324 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-584kl"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.550596 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.564180 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.620922 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"]
Mar 01 09:26:22 crc kubenswrapper[4792]: W0301 09:26:22.631748 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ebfee14_6440_476a_87ff_fc933df3eaa8.slice/crio-dfb18f2a46a9a4bd99a6d3ee488952b80257c666cf6b574e91f2c62a39fd84f6 WatchSource:0}: Error finding container dfb18f2a46a9a4bd99a6d3ee488952b80257c666cf6b574e91f2c62a39fd84f6: Status 404 returned error can't find the container with id dfb18f2a46a9a4bd99a6d3ee488952b80257c666cf6b574e91f2c62a39fd84f6
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.671764 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" event={"ID":"1ebfee14-6440-476a-87ff-fc933df3eaa8","Type":"ContainerStarted","Data":"dfb18f2a46a9a4bd99a6d3ee488952b80257c666cf6b574e91f2c62a39fd84f6"}
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.673452 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" event={"ID":"f8a60915-b57b-4331-ad0e-b671ff576a69","Type":"ContainerStarted","Data":"67be12a989d9394e207bd8944b8cd6294460a40d572fd7dffd8e27ec9d363809"}
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675336 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675384 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675413 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675439 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675477 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675524 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675547 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675596 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675629 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675659 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8fhc\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-kube-api-access-q8fhc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675834 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777433 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777475 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777501 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777532 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777553 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777579 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777602 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777618 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777652 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777687 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777706 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8fhc\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-kube-api-access-q8fhc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.779466 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.780165 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.780195 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.781067 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.781235 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.782003 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.793276 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.793356 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.796412 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.805092 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8fhc\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-kube-api-access-q8fhc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.811158 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.825069 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.882556 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.891606 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.895936 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.899444 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.899656 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.899870 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5zwb6"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.899708 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.900081 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.900310 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.909144 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.925672 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984303 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984406 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984449 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnscv\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-kube-api-access-qnscv\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984482 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984500 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-config-data\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984536 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984562 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984575 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6252a079-917c-46e8-a848-10569e1e057e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984601 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984638 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6252a079-917c-46e8-a848-10569e1e057e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984654 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.086099 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.086139 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.086192 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6252a079-917c-46e8-a848-10569e1e057e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.086278 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.088525 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName:
\"kubernetes.io/secret/6252a079-917c-46e8-a848-10569e1e057e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.088596 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.088688 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.088785 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.088831 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnscv\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-kube-api-access-qnscv\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.088878 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.089090 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.089101 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.089120 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.089406 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.089783 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-config-data\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.090581 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.090785 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.098429 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.100756 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.106677 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6252a079-917c-46e8-a848-10569e1e057e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.107564 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnscv\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-kube-api-access-qnscv\") pod \"rabbitmq-server-0\" (UID: 
\"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.111770 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6252a079-917c-46e8-a848-10569e1e057e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.114970 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.230667 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.587254 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.686356 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24","Type":"ContainerStarted","Data":"3e8de91b3c58261b32cbdb52401a16acdc8aa762850b0b7a587dfa85e98e1d6e"} Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.860566 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 01 09:26:23 crc kubenswrapper[4792]: W0301 09:26:23.861177 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6252a079_917c_46e8_a848_10569e1e057e.slice/crio-dda196254ac808b630ce64a84572b51fae2b42596880acb14f053e9eec301086 WatchSource:0}: Error finding container dda196254ac808b630ce64a84572b51fae2b42596880acb14f053e9eec301086: Status 
404 returned error can't find the container with id dda196254ac808b630ce64a84572b51fae2b42596880acb14f053e9eec301086 Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.995391 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.997094 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:23.998531 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-cnt9x" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.005949 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.006145 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.006276 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.006729 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.008879 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120207 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-config-data-default\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120250 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b969e6eb-14a7-4e45-8342-ccbd05c06261-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120284 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-kolla-config\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120300 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120420 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b969e6eb-14a7-4e45-8342-ccbd05c06261-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120503 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120531 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzb9z\" (UniqueName: 
\"kubernetes.io/projected/b969e6eb-14a7-4e45-8342-ccbd05c06261-kube-api-access-zzb9z\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120550 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b969e6eb-14a7-4e45-8342-ccbd05c06261-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.222853 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-config-data-default\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.222937 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b969e6eb-14a7-4e45-8342-ccbd05c06261-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.222982 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-kolla-config\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.223003 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-operator-scripts\") pod 
\"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.223031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b969e6eb-14a7-4e45-8342-ccbd05c06261-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.223077 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.223112 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzb9z\" (UniqueName: \"kubernetes.io/projected/b969e6eb-14a7-4e45-8342-ccbd05c06261-kube-api-access-zzb9z\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.223134 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b969e6eb-14a7-4e45-8342-ccbd05c06261-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.223639 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b969e6eb-14a7-4e45-8342-ccbd05c06261-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc 
kubenswrapper[4792]: I0301 09:26:24.224025 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.224767 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-config-data-default\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.229784 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-kolla-config\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.231569 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.241141 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b969e6eb-14a7-4e45-8342-ccbd05c06261-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.248963 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzb9z\" (UniqueName: 
\"kubernetes.io/projected/b969e6eb-14a7-4e45-8342-ccbd05c06261-kube-api-access-zzb9z\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.254544 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b969e6eb-14a7-4e45-8342-ccbd05c06261-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.277405 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.332819 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.782298 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6252a079-917c-46e8-a848-10569e1e057e","Type":"ContainerStarted","Data":"dda196254ac808b630ce64a84572b51fae2b42596880acb14f053e9eec301086"} Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.997018 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 01 09:26:25 crc kubenswrapper[4792]: W0301 09:26:25.023168 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb969e6eb_14a7_4e45_8342_ccbd05c06261.slice/crio-afc0801880f737b0b1ad27c6152a8ab04e9874208174641d073668979b7169df WatchSource:0}: Error finding container afc0801880f737b0b1ad27c6152a8ab04e9874208174641d073668979b7169df: Status 404 returned error can't find the container with id afc0801880f737b0b1ad27c6152a8ab04e9874208174641d073668979b7169df Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.375283 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.377992 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.380371 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.381154 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.381199 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-7wbzt" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.382614 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.385568 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545383 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545430 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545494 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545524 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2d03d42-7830-444b-a8ae-c91e16d352b9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545555 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9gx9\" (UniqueName: \"kubernetes.io/projected/f2d03d42-7830-444b-a8ae-c91e16d352b9-kube-api-access-p9gx9\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545579 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2d03d42-7830-444b-a8ae-c91e16d352b9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545654 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545677 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f2d03d42-7830-444b-a8ae-c91e16d352b9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.612656 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.613636 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.614423 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.617637 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.617786 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-fwwfv" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.617898 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.646462 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.646506 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2d03d42-7830-444b-a8ae-c91e16d352b9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: 
I0301 09:26:25.646543 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9gx9\" (UniqueName: \"kubernetes.io/projected/f2d03d42-7830-444b-a8ae-c91e16d352b9-kube-api-access-p9gx9\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.646572 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2d03d42-7830-444b-a8ae-c91e16d352b9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.646619 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.646644 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d03d42-7830-444b-a8ae-c91e16d352b9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.646673 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.646689 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.647996 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2d03d42-7830-444b-a8ae-c91e16d352b9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.648250 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.648628 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.648778 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.652838 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.665285 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2d03d42-7830-444b-a8ae-c91e16d352b9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.676148 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d03d42-7830-444b-a8ae-c91e16d352b9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.688263 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.688951 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9gx9\" (UniqueName: \"kubernetes.io/projected/f2d03d42-7830-444b-a8ae-c91e16d352b9-kube-api-access-p9gx9\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.718563 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.749087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-memcached-tls-certs\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.749158 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-kolla-config\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.749185 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-combined-ca-bundle\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.749222 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j72rb\" (UniqueName: \"kubernetes.io/projected/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-kube-api-access-j72rb\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.749304 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-config-data\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 
09:26:25.799397 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b969e6eb-14a7-4e45-8342-ccbd05c06261","Type":"ContainerStarted","Data":"afc0801880f737b0b1ad27c6152a8ab04e9874208174641d073668979b7169df"} Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.850601 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-memcached-tls-certs\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.850661 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-kolla-config\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.850684 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-combined-ca-bundle\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.850719 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j72rb\" (UniqueName: \"kubernetes.io/projected/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-kube-api-access-j72rb\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.850776 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-config-data\") pod \"memcached-0\" (UID: 
\"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.851823 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-config-data\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.852315 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-kolla-config\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.865532 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-combined-ca-bundle\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.865543 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-memcached-tls-certs\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0" Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.893874 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j72rb\" (UniqueName: \"kubernetes.io/projected/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-kube-api-access-j72rb\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0" Mar 01 09:26:26 crc kubenswrapper[4792]: I0301 09:26:26.025095 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 01 09:26:26 crc kubenswrapper[4792]: I0301 09:26:26.458527 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 01 09:26:26 crc kubenswrapper[4792]: W0301 09:26:26.466543 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2d03d42_7830_444b_a8ae_c91e16d352b9.slice/crio-03d3c0f01b804fa527d0850dc14fc1a1b576cac0905f5fb14b0077932be52f81 WatchSource:0}: Error finding container 03d3c0f01b804fa527d0850dc14fc1a1b576cac0905f5fb14b0077932be52f81: Status 404 returned error can't find the container with id 03d3c0f01b804fa527d0850dc14fc1a1b576cac0905f5fb14b0077932be52f81 Mar 01 09:26:26 crc kubenswrapper[4792]: I0301 09:26:26.668806 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 01 09:26:26 crc kubenswrapper[4792]: W0301 09:26:26.713577 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84d455ad_7bbb_4771_a8ed_9aa1984e1d40.slice/crio-66a8dfae8907fc1609feadd743952be29a2aa99fa175799dda640835c77de4d2 WatchSource:0}: Error finding container 66a8dfae8907fc1609feadd743952be29a2aa99fa175799dda640835c77de4d2: Status 404 returned error can't find the container with id 66a8dfae8907fc1609feadd743952be29a2aa99fa175799dda640835c77de4d2 Mar 01 09:26:26 crc kubenswrapper[4792]: I0301 09:26:26.818990 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f2d03d42-7830-444b-a8ae-c91e16d352b9","Type":"ContainerStarted","Data":"03d3c0f01b804fa527d0850dc14fc1a1b576cac0905f5fb14b0077932be52f81"} Mar 01 09:26:26 crc kubenswrapper[4792]: I0301 09:26:26.825130 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"84d455ad-7bbb-4771-a8ed-9aa1984e1d40","Type":"ContainerStarted","Data":"66a8dfae8907fc1609feadd743952be29a2aa99fa175799dda640835c77de4d2"} Mar 01 09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.211440 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 01 09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.212739 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 01 09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.216073 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-nc787" Mar 01 09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.217456 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 01 09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.311332 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5xkf\" (UniqueName: \"kubernetes.io/projected/c5db40bf-18aa-4877-ad92-35d50c549309-kube-api-access-t5xkf\") pod \"kube-state-metrics-0\" (UID: \"c5db40bf-18aa-4877-ad92-35d50c549309\") " pod="openstack/kube-state-metrics-0" Mar 01 09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.413739 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5xkf\" (UniqueName: \"kubernetes.io/projected/c5db40bf-18aa-4877-ad92-35d50c549309-kube-api-access-t5xkf\") pod \"kube-state-metrics-0\" (UID: \"c5db40bf-18aa-4877-ad92-35d50c549309\") " pod="openstack/kube-state-metrics-0" Mar 01 09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.436950 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5xkf\" (UniqueName: \"kubernetes.io/projected/c5db40bf-18aa-4877-ad92-35d50c549309-kube-api-access-t5xkf\") pod \"kube-state-metrics-0\" (UID: \"c5db40bf-18aa-4877-ad92-35d50c549309\") " pod="openstack/kube-state-metrics-0" Mar 01 
09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.548777 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 01 09:26:29 crc kubenswrapper[4792]: I0301 09:26:29.089054 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 01 09:26:30 crc kubenswrapper[4792]: I0301 09:26:29.865499 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c5db40bf-18aa-4877-ad92-35d50c549309","Type":"ContainerStarted","Data":"d3de3b349ed8682aadffbec7a09f7bd847d16614859016d386affe481743f302"} Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.196604 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mpvqc"] Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.197814 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.201592 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-s2dfs" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.201794 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.201937 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.223656 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mpvqc"] Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.235101 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-nfzrr"] Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.236848 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.261968 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-nfzrr"] Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.369784 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4sph\" (UniqueName: \"kubernetes.io/projected/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-kube-api-access-p4sph\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.369837 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-scripts\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.369928 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-log-ovn\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.369949 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-run\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370012 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d50ee3b1-4f97-4644-802d-04c85d9c3abc-combined-ca-bundle\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370109 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d50ee3b1-4f97-4644-802d-04c85d9c3abc-scripts\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370143 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-log\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370174 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-lib\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370194 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-etc-ovs\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370214 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d50ee3b1-4f97-4644-802d-04c85d9c3abc-ovn-controller-tls-certs\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370279 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-run-ovn\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370302 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffh47\" (UniqueName: \"kubernetes.io/projected/d50ee3b1-4f97-4644-802d-04c85d9c3abc-kube-api-access-ffh47\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370342 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-run\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472168 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-scripts\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472315 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-log-ovn\") pod 
\"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472339 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50ee3b1-4f97-4644-802d-04c85d9c3abc-combined-ca-bundle\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472358 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-run\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472386 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d50ee3b1-4f97-4644-802d-04c85d9c3abc-scripts\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472404 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-log\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472428 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-lib\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 
09:26:31.472445 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-etc-ovs\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472462 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d50ee3b1-4f97-4644-802d-04c85d9c3abc-ovn-controller-tls-certs\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472493 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-run-ovn\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472509 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffh47\" (UniqueName: \"kubernetes.io/projected/d50ee3b1-4f97-4644-802d-04c85d9c3abc-kube-api-access-ffh47\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472533 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-run\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472557 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4sph\" 
(UniqueName: \"kubernetes.io/projected/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-kube-api-access-p4sph\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472870 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-run\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472967 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-log-ovn\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.474147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-lib\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.474291 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-log\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.474428 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-run\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " 
pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.474540 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-etc-ovs\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.474580 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-run-ovn\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.474848 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d50ee3b1-4f97-4644-802d-04c85d9c3abc-scripts\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.475973 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-scripts\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.480633 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d50ee3b1-4f97-4644-802d-04c85d9c3abc-ovn-controller-tls-certs\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.480674 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50ee3b1-4f97-4644-802d-04c85d9c3abc-combined-ca-bundle\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.497518 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffh47\" (UniqueName: \"kubernetes.io/projected/d50ee3b1-4f97-4644-802d-04c85d9c3abc-kube-api-access-ffh47\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.500940 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4sph\" (UniqueName: \"kubernetes.io/projected/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-kube-api-access-p4sph\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.513742 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.556512 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.896758 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.898698 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.905325 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.905449 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.906084 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.906363 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-5fgfl" Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.909148 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.909442 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016367 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016416 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016444 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a20f7417-3c04-411a-88b9-d60664faaee3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016586 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a20f7417-3c04-411a-88b9-d60664faaee3-config\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016743 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psjgb\" (UniqueName: \"kubernetes.io/projected/a20f7417-3c04-411a-88b9-d60664faaee3-kube-api-access-psjgb\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016819 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016872 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a20f7417-3c04-411a-88b9-d60664faaee3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016964 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.124792 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psjgb\" (UniqueName: \"kubernetes.io/projected/a20f7417-3c04-411a-88b9-d60664faaee3-kube-api-access-psjgb\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.124863 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.124895 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a20f7417-3c04-411a-88b9-d60664faaee3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.124952 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.125008 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " 
pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.125026 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.125041 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a20f7417-3c04-411a-88b9-d60664faaee3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.125062 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a20f7417-3c04-411a-88b9-d60664faaee3-config\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.126255 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a20f7417-3c04-411a-88b9-d60664faaee3-config\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.126760 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a20f7417-3c04-411a-88b9-d60664faaee3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.127012 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.128345 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a20f7417-3c04-411a-88b9-d60664faaee3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.132454 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.142797 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.145363 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.149355 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psjgb\" (UniqueName: \"kubernetes.io/projected/a20f7417-3c04-411a-88b9-d60664faaee3-kube-api-access-psjgb\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " 
pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.150552 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.228938 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.583811 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.585026 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.592224 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.592273 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.593437 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.593815 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-zmrfb" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.594122 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733507 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733549 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9312b5-705e-42f0-8462-62c8fdeb0791-config\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733595 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733614 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkgrk\" (UniqueName: \"kubernetes.io/projected/2c9312b5-705e-42f0-8462-62c8fdeb0791-kube-api-access-vkgrk\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733662 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2c9312b5-705e-42f0-8462-62c8fdeb0791-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733687 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c9312b5-705e-42f0-8462-62c8fdeb0791-scripts\") pod 
\"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733703 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733723 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835344 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2c9312b5-705e-42f0-8462-62c8fdeb0791-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835400 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835417 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c9312b5-705e-42f0-8462-62c8fdeb0791-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835438 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835478 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835497 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9312b5-705e-42f0-8462-62c8fdeb0791-config\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835534 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835549 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkgrk\" (UniqueName: \"kubernetes.io/projected/2c9312b5-705e-42f0-8462-62c8fdeb0791-kube-api-access-vkgrk\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.836293 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.837236 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2c9312b5-705e-42f0-8462-62c8fdeb0791-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.837531 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9312b5-705e-42f0-8462-62c8fdeb0791-config\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.838766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c9312b5-705e-42f0-8462-62c8fdeb0791-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.839803 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.840058 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc 
kubenswrapper[4792]: I0301 09:26:34.848721 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.852510 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkgrk\" (UniqueName: \"kubernetes.io/projected/2c9312b5-705e-42f0-8462-62c8fdeb0791-kube-api-access-vkgrk\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.870083 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.920547 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.943958 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.944004 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.056283 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.056997 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6fwv9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7c47bcb9f9-xrm9m_openstack(1ebfee14-6440-476a-87ff-fc933df3eaa8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.058222 4792 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" podUID="1ebfee14-6440-476a-87ff-fc933df3eaa8" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.102771 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.103090 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ws769,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78cb4465c9-czwcl_openstack(f8a60915-b57b-4331-ad0e-b671ff576a69): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.104495 4792 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" podUID="f8a60915-b57b-4331-ad0e-b671ff576a69" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.126616 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.126785 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cczf7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-589db6c89c-sqrnc_openstack(40e11436-ae27-48c0-8baf-3f0aecc8e73c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.127536 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.127704 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q27bj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*
true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86bbd886cf-brcdx_openstack(bf78c5d7-c647-4862-bc7b-e14f8de9ef0f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.128898 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" podUID="40e11436-ae27-48c0-8baf-3f0aecc8e73c" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.129199 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" podUID="bf78c5d7-c647-4862-bc7b-e14f8de9ef0f" Mar 01 09:26:49 crc kubenswrapper[4792]: I0301 09:26:49.598338 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mpvqc"] Mar 01 09:26:49 crc kubenswrapper[4792]: W0301 09:26:49.946024 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd50ee3b1_4f97_4644_802d_04c85d9c3abc.slice/crio-2e6843120ab5bdcf9f58b38c0a595cc6cbfe7fb2ddd2cd6a13fd86ac14443855 WatchSource:0}: Error finding container 2e6843120ab5bdcf9f58b38c0a595cc6cbfe7fb2ddd2cd6a13fd86ac14443855: Status 404 returned error can't find the container with id 
2e6843120ab5bdcf9f58b38c0a595cc6cbfe7fb2ddd2cd6a13fd86ac14443855 Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.012688 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.032773 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b969e6eb-14a7-4e45-8342-ccbd05c06261","Type":"ContainerStarted","Data":"8454d15743412518e314144a6752ae268a6efc96deec97d3b909bd1028bb3b9b"} Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.037179 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mpvqc" event={"ID":"d50ee3b1-4f97-4644-802d-04c85d9c3abc","Type":"ContainerStarted","Data":"2e6843120ab5bdcf9f58b38c0a595cc6cbfe7fb2ddd2cd6a13fd86ac14443855"} Mar 01 09:26:50 crc kubenswrapper[4792]: E0301 09:26:50.038537 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" podUID="1ebfee14-6440-476a-87ff-fc933df3eaa8" Mar 01 09:26:50 crc kubenswrapper[4792]: E0301 09:26:50.038714 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" podUID="f8a60915-b57b-4331-ad0e-b671ff576a69" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.517412 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.567383 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.613235 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-dns-svc\") pod \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.613414 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-config\") pod \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.613866 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-config" (OuterVolumeSpecName: "config") pod "bf78c5d7-c647-4862-bc7b-e14f8de9ef0f" (UID: "bf78c5d7-c647-4862-bc7b-e14f8de9ef0f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.613859 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bf78c5d7-c647-4862-bc7b-e14f8de9ef0f" (UID: "bf78c5d7-c647-4862-bc7b-e14f8de9ef0f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.614019 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q27bj\" (UniqueName: \"kubernetes.io/projected/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-kube-api-access-q27bj\") pod \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.614503 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.614523 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.620608 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-kube-api-access-q27bj" (OuterVolumeSpecName: "kube-api-access-q27bj") pod "bf78c5d7-c647-4862-bc7b-e14f8de9ef0f" (UID: "bf78c5d7-c647-4862-bc7b-e14f8de9ef0f"). InnerVolumeSpecName "kube-api-access-q27bj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.715433 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cczf7\" (UniqueName: \"kubernetes.io/projected/40e11436-ae27-48c0-8baf-3f0aecc8e73c-kube-api-access-cczf7\") pod \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.715560 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e11436-ae27-48c0-8baf-3f0aecc8e73c-config\") pod \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.715898 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q27bj\" (UniqueName: \"kubernetes.io/projected/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-kube-api-access-q27bj\") on node \"crc\" DevicePath \"\"" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.716311 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40e11436-ae27-48c0-8baf-3f0aecc8e73c-config" (OuterVolumeSpecName: "config") pod "40e11436-ae27-48c0-8baf-3f0aecc8e73c" (UID: "40e11436-ae27-48c0-8baf-3f0aecc8e73c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.720225 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40e11436-ae27-48c0-8baf-3f0aecc8e73c-kube-api-access-cczf7" (OuterVolumeSpecName: "kube-api-access-cczf7") pod "40e11436-ae27-48c0-8baf-3f0aecc8e73c" (UID: "40e11436-ae27-48c0-8baf-3f0aecc8e73c"). InnerVolumeSpecName "kube-api-access-cczf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.801322 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-nfzrr"] Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.817356 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e11436-ae27-48c0-8baf-3f0aecc8e73c-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.817393 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cczf7\" (UniqueName: \"kubernetes.io/projected/40e11436-ae27-48c0-8baf-3f0aecc8e73c-kube-api-access-cczf7\") on node \"crc\" DevicePath \"\"" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.973637 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.043947 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2c9312b5-705e-42f0-8462-62c8fdeb0791","Type":"ContainerStarted","Data":"7a6c996b1e0956eceabbb1bb33423f1746d6ef7cb8bf6ead0556fc4690b162c1"} Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.045489 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6252a079-917c-46e8-a848-10569e1e057e","Type":"ContainerStarted","Data":"81c1c2615bd05b6e2f8a23b6d892f8335b3c7a5c117575ce3ed245f2faa7543f"} Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.048474 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f2d03d42-7830-444b-a8ae-c91e16d352b9","Type":"ContainerStarted","Data":"b1b2332536ff5b014987937edb8cc1b3217d724f3d46eed3bc72b1534e0bed78"} Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.050860 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"84d455ad-7bbb-4771-a8ed-9aa1984e1d40","Type":"ContainerStarted","Data":"f83e1270986c4d5bf17f9e23a710fe04a8f8a3a1361ff4a562a5dbb07885c3b0"} Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.051058 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.053009 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" event={"ID":"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f","Type":"ContainerDied","Data":"8e90037762d911668e790efa9d44feb3d2539a5a373a55b8095279629113a42e"} Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.053113 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.053976 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" event={"ID":"40e11436-ae27-48c0-8baf-3f0aecc8e73c","Type":"ContainerDied","Data":"89744d9bb053371289c3f391707b9cba897bfe50eed733d33f5f66e5f68036ab"} Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.054288 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.114440 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.699907571 podStartE2EDuration="26.114422572s" podCreationTimestamp="2026-03-01 09:26:25 +0000 UTC" firstStartedPulling="2026-03-01 09:26:26.715992601 +0000 UTC m=+1115.957871798" lastFinishedPulling="2026-03-01 09:26:49.130507602 +0000 UTC m=+1138.372386799" observedRunningTime="2026-03-01 09:26:51.109522655 +0000 UTC m=+1140.351401862" watchObservedRunningTime="2026-03-01 09:26:51.114422572 +0000 UTC m=+1140.356301769" Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.162616 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-sqrnc"] Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.168517 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-sqrnc"] Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.201939 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-brcdx"] Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.204594 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-brcdx"] Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.441223 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40e11436-ae27-48c0-8baf-3f0aecc8e73c" path="/var/lib/kubelet/pods/40e11436-ae27-48c0-8baf-3f0aecc8e73c/volumes" Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.445812 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf78c5d7-c647-4862-bc7b-e14f8de9ef0f" path="/var/lib/kubelet/pods/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f/volumes" Mar 01 09:26:51 crc kubenswrapper[4792]: W0301 09:26:51.658837 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22d78adc_2ff6_4f03_b60e_ac8e9a0f3699.slice/crio-8302b894750954038bb6748f427e127e1b452eb998c2786a648ff5cb7f54a541 WatchSource:0}: Error finding container 8302b894750954038bb6748f427e127e1b452eb998c2786a648ff5cb7f54a541: Status 404 returned error can't find the container with id 8302b894750954038bb6748f427e127e1b452eb998c2786a648ff5cb7f54a541 Mar 01 09:26:51 crc kubenswrapper[4792]: W0301 09:26:51.662531 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda20f7417_3c04_411a_88b9_d60664faaee3.slice/crio-99a86d134f213914eac943cfcb04a2986a73b0106133daa7ff5aafdeef2850cf WatchSource:0}: Error finding container 99a86d134f213914eac943cfcb04a2986a73b0106133daa7ff5aafdeef2850cf: Status 404 returned error can't find the container with id 99a86d134f213914eac943cfcb04a2986a73b0106133daa7ff5aafdeef2850cf Mar 01 09:26:52 crc kubenswrapper[4792]: I0301 09:26:52.062038 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a20f7417-3c04-411a-88b9-d60664faaee3","Type":"ContainerStarted","Data":"99a86d134f213914eac943cfcb04a2986a73b0106133daa7ff5aafdeef2850cf"} Mar 01 09:26:52 crc kubenswrapper[4792]: I0301 09:26:52.064111 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c5db40bf-18aa-4877-ad92-35d50c549309","Type":"ContainerStarted","Data":"ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b"} Mar 01 09:26:52 crc kubenswrapper[4792]: I0301 09:26:52.064375 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 01 09:26:52 crc kubenswrapper[4792]: I0301 09:26:52.067581 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24","Type":"ContainerStarted","Data":"6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b"} Mar 01 09:26:52 crc kubenswrapper[4792]: I0301 09:26:52.069529 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nfzrr" event={"ID":"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699","Type":"ContainerStarted","Data":"8302b894750954038bb6748f427e127e1b452eb998c2786a648ff5cb7f54a541"} Mar 01 09:26:52 crc kubenswrapper[4792]: I0301 09:26:52.087655 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.474786667 podStartE2EDuration="24.087639814s" podCreationTimestamp="2026-03-01 09:26:28 +0000 UTC" firstStartedPulling="2026-03-01 09:26:29.099865842 +0000 UTC m=+1118.341745029" lastFinishedPulling="2026-03-01 09:26:51.712718969 +0000 UTC m=+1140.954598176" observedRunningTime="2026-03-01 09:26:52.08033771 +0000 UTC m=+1141.322216927" watchObservedRunningTime="2026-03-01 09:26:52.087639814 +0000 UTC m=+1141.329519011" Mar 01 09:26:53 crc kubenswrapper[4792]: I0301 09:26:53.076976 4792 generic.go:334] "Generic (PLEG): container finished" podID="b969e6eb-14a7-4e45-8342-ccbd05c06261" containerID="8454d15743412518e314144a6752ae268a6efc96deec97d3b909bd1028bb3b9b" exitCode=0 Mar 01 09:26:53 crc kubenswrapper[4792]: I0301 09:26:53.078296 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b969e6eb-14a7-4e45-8342-ccbd05c06261","Type":"ContainerDied","Data":"8454d15743412518e314144a6752ae268a6efc96deec97d3b909bd1028bb3b9b"} Mar 01 09:26:54 crc kubenswrapper[4792]: I0301 09:26:54.090670 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b969e6eb-14a7-4e45-8342-ccbd05c06261","Type":"ContainerStarted","Data":"527f7c5f55b21db39864a6aeed6e8a3a82f2058bad8f4f37fe92df1ddfa952ac"} Mar 01 09:26:54 crc kubenswrapper[4792]: I0301 
09:26:54.096958 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mpvqc" event={"ID":"d50ee3b1-4f97-4644-802d-04c85d9c3abc","Type":"ContainerStarted","Data":"4c150d0b8509b0afa6465f1ae370aea56d239728b3bbd15c71f0d2de67de9cb9"} Mar 01 09:26:54 crc kubenswrapper[4792]: I0301 09:26:54.097090 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:54 crc kubenswrapper[4792]: I0301 09:26:54.098593 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nfzrr" event={"ID":"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699","Type":"ContainerStarted","Data":"ae8e8a3920d3b3061bc4fa93f61aeb832c261d8d34750eb1a7881044ff0f527a"} Mar 01 09:26:54 crc kubenswrapper[4792]: I0301 09:26:54.120650 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.019725942 podStartE2EDuration="32.120628322s" podCreationTimestamp="2026-03-01 09:26:22 +0000 UTC" firstStartedPulling="2026-03-01 09:26:25.030789221 +0000 UTC m=+1114.272668418" lastFinishedPulling="2026-03-01 09:26:49.131691601 +0000 UTC m=+1138.373570798" observedRunningTime="2026-03-01 09:26:54.109678421 +0000 UTC m=+1143.351557618" watchObservedRunningTime="2026-03-01 09:26:54.120628322 +0000 UTC m=+1143.362507519" Mar 01 09:26:54 crc kubenswrapper[4792]: I0301 09:26:54.137297 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-mpvqc" podStartSLOduration=19.383765249 podStartE2EDuration="23.137276859s" podCreationTimestamp="2026-03-01 09:26:31 +0000 UTC" firstStartedPulling="2026-03-01 09:26:49.951478647 +0000 UTC m=+1139.193357834" lastFinishedPulling="2026-03-01 09:26:53.704990247 +0000 UTC m=+1142.946869444" observedRunningTime="2026-03-01 09:26:54.129376191 +0000 UTC m=+1143.371255378" watchObservedRunningTime="2026-03-01 09:26:54.137276859 +0000 UTC m=+1143.379156056" Mar 01 09:26:54 crc 
kubenswrapper[4792]: I0301 09:26:54.334927 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 01 09:26:54 crc kubenswrapper[4792]: I0301 09:26:54.334974 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 01 09:26:55 crc kubenswrapper[4792]: I0301 09:26:55.107971 4792 generic.go:334] "Generic (PLEG): container finished" podID="22d78adc-2ff6-4f03-b60e-ac8e9a0f3699" containerID="ae8e8a3920d3b3061bc4fa93f61aeb832c261d8d34750eb1a7881044ff0f527a" exitCode=0 Mar 01 09:26:55 crc kubenswrapper[4792]: I0301 09:26:55.108041 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nfzrr" event={"ID":"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699","Type":"ContainerDied","Data":"ae8e8a3920d3b3061bc4fa93f61aeb832c261d8d34750eb1a7881044ff0f527a"} Mar 01 09:26:55 crc kubenswrapper[4792]: I0301 09:26:55.111934 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a20f7417-3c04-411a-88b9-d60664faaee3","Type":"ContainerStarted","Data":"277b3aeef07b3328e0c9713011a75e57d246f6728cbc9d042c8b0f4f1f114f8b"} Mar 01 09:26:55 crc kubenswrapper[4792]: I0301 09:26:55.116938 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2c9312b5-705e-42f0-8462-62c8fdeb0791","Type":"ContainerStarted","Data":"4f34373b18e0c92342a207d871ff46d575d902ff599d1c4b8091120a52e3b9a0"} Mar 01 09:26:55 crc kubenswrapper[4792]: I0301 09:26:55.120500 4792 generic.go:334] "Generic (PLEG): container finished" podID="f2d03d42-7830-444b-a8ae-c91e16d352b9" containerID="b1b2332536ff5b014987937edb8cc1b3217d724f3d46eed3bc72b1534e0bed78" exitCode=0 Mar 01 09:26:55 crc kubenswrapper[4792]: I0301 09:26:55.120596 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"f2d03d42-7830-444b-a8ae-c91e16d352b9","Type":"ContainerDied","Data":"b1b2332536ff5b014987937edb8cc1b3217d724f3d46eed3bc72b1534e0bed78"} Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.026488 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.136788 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nfzrr" event={"ID":"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699","Type":"ContainerStarted","Data":"79c70ae2f20509a4f11e5f4d26c6756782d88cbdf2b0efc87c5a966cc1e41b31"} Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.136869 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nfzrr" event={"ID":"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699","Type":"ContainerStarted","Data":"da0e79386e4a5ecd379d8a12d635afee3fe617678251383677aa69373b3f0009"} Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.136945 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.142282 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f2d03d42-7830-444b-a8ae-c91e16d352b9","Type":"ContainerStarted","Data":"09bcfb4050ee82529f2845f71f1c73d4c107dbe358b73d31f7dcff55d2806351"} Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.168175 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-nfzrr" podStartSLOduration=23.12564547 podStartE2EDuration="25.168151116s" podCreationTimestamp="2026-03-01 09:26:31 +0000 UTC" firstStartedPulling="2026-03-01 09:26:51.662721917 +0000 UTC m=+1140.904601114" lastFinishedPulling="2026-03-01 09:26:53.705227563 +0000 UTC m=+1142.947106760" observedRunningTime="2026-03-01 09:26:56.156807156 +0000 UTC m=+1145.398686353" watchObservedRunningTime="2026-03-01 
09:26:56.168151116 +0000 UTC m=+1145.410030323" Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.185855 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=9.497912282 podStartE2EDuration="32.185833308s" podCreationTimestamp="2026-03-01 09:26:24 +0000 UTC" firstStartedPulling="2026-03-01 09:26:26.46917743 +0000 UTC m=+1115.711056627" lastFinishedPulling="2026-03-01 09:26:49.157098456 +0000 UTC m=+1138.398977653" observedRunningTime="2026-03-01 09:26:56.181059744 +0000 UTC m=+1145.422938941" watchObservedRunningTime="2026-03-01 09:26:56.185833308 +0000 UTC m=+1145.427712515" Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.556926 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.156794 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a20f7417-3c04-411a-88b9-d60664faaee3","Type":"ContainerStarted","Data":"9ac8557067dcd7b2193acc9fb67f687c52d1c8c861acf95a5d0d3d85d48e58ad"} Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.158711 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2c9312b5-705e-42f0-8462-62c8fdeb0791","Type":"ContainerStarted","Data":"7319fba5c487cb4e2250ecbe1f38126ed3f63bfddd53b74bfe7cd6a648f825f9"} Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.181193 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=20.15292734 podStartE2EDuration="26.18117766s" podCreationTimestamp="2026-03-01 09:26:32 +0000 UTC" firstStartedPulling="2026-03-01 09:26:51.665544545 +0000 UTC m=+1140.907423742" lastFinishedPulling="2026-03-01 09:26:57.693794875 +0000 UTC m=+1146.935674062" observedRunningTime="2026-03-01 09:26:58.179367216 +0000 UTC m=+1147.421246413" 
watchObservedRunningTime="2026-03-01 09:26:58.18117766 +0000 UTC m=+1147.423056847" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.210260 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.717987374 podStartE2EDuration="25.210243762s" podCreationTimestamp="2026-03-01 09:26:33 +0000 UTC" firstStartedPulling="2026-03-01 09:26:50.191856546 +0000 UTC m=+1139.433735743" lastFinishedPulling="2026-03-01 09:26:57.684112934 +0000 UTC m=+1146.925992131" observedRunningTime="2026-03-01 09:26:58.203355878 +0000 UTC m=+1147.445235075" watchObservedRunningTime="2026-03-01 09:26:58.210243762 +0000 UTC m=+1147.452122959" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.229352 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.272043 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.423172 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.507947 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.556535 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.920779 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.957451 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.167094 4792 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.167136 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.202815 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.208548 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.553628 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-czwcl"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.623058 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-j6c8g"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.632497 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.638740 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.641295 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-j6c8g"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.666738 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-7wc55"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.667690 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.672171 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.683878 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7wc55"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763688 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9493aff0-58e3-44ca-ba01-69f3b284d732-ovs-rundir\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763738 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-dns-svc\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763762 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9493aff0-58e3-44ca-ba01-69f3b284d732-combined-ca-bundle\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763783 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9493aff0-58e3-44ca-ba01-69f3b284d732-config\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " 
pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763825 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9493aff0-58e3-44ca-ba01-69f3b284d732-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763856 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzbw9\" (UniqueName: \"kubernetes.io/projected/87678b56-0909-4735-ad6b-cb992dc86853-kube-api-access-tzbw9\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763876 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wx4d\" (UniqueName: \"kubernetes.io/projected/9493aff0-58e3-44ca-ba01-69f3b284d732-kube-api-access-6wx4d\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763900 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9493aff0-58e3-44ca-ba01-69f3b284d732-ovn-rundir\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763965 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-config\") pod 
\"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763995 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.850226 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.861582 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865009 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865029 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-dns-svc\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865076 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9493aff0-58e3-44ca-ba01-69f3b284d732-combined-ca-bundle\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865097 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9493aff0-58e3-44ca-ba01-69f3b284d732-config\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865142 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9493aff0-58e3-44ca-ba01-69f3b284d732-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865182 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzbw9\" (UniqueName: \"kubernetes.io/projected/87678b56-0909-4735-ad6b-cb992dc86853-kube-api-access-tzbw9\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865200 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wx4d\" (UniqueName: \"kubernetes.io/projected/9493aff0-58e3-44ca-ba01-69f3b284d732-kube-api-access-6wx4d\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865215 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9493aff0-58e3-44ca-ba01-69f3b284d732-ovn-rundir\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865270 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-config\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865302 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865321 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9493aff0-58e3-44ca-ba01-69f3b284d732-ovs-rundir\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865703 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9493aff0-58e3-44ca-ba01-69f3b284d732-ovn-rundir\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.866409 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9493aff0-58e3-44ca-ba01-69f3b284d732-config\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.866435 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9493aff0-58e3-44ca-ba01-69f3b284d732-ovs-rundir\") pod \"ovn-controller-metrics-7wc55\" (UID: 
\"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.866488 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-config\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.866717 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.874643 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9493aff0-58e3-44ca-ba01-69f3b284d732-combined-ca-bundle\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.887331 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.887525 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.887633 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.887732 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-f4phx" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.888710 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9493aff0-58e3-44ca-ba01-69f3b284d732-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.897528 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-dns-svc\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.925422 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.927569 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wx4d\" (UniqueName: \"kubernetes.io/projected/9493aff0-58e3-44ca-ba01-69f3b284d732-kube-api-access-6wx4d\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.928074 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzbw9\" (UniqueName: \"kubernetes.io/projected/87678b56-0909-4735-ad6b-cb992dc86853-kube-api-access-tzbw9\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.946548 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-mz86z"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.956582 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.968608 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.972153 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.980944 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.981061 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.981179 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1712e112-23fd-402b-ae0b-f63a594d4fab-config\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.981279 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1712e112-23fd-402b-ae0b-f63a594d4fab-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.981356 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1712e112-23fd-402b-ae0b-f63a594d4fab-scripts\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.981466 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84hzh\" (UniqueName: \"kubernetes.io/projected/1712e112-23fd-402b-ae0b-f63a594d4fab-kube-api-access-84hzh\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.981547 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.010230 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.016227 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-mz86z"] Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.084037 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.084079 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m768\" (UniqueName: \"kubernetes.io/projected/b3682585-f554-4a65-86cb-096243ccc793-kube-api-access-8m768\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.084113 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1712e112-23fd-402b-ae0b-f63a594d4fab-config\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.084148 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1712e112-23fd-402b-ae0b-f63a594d4fab-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085181 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1712e112-23fd-402b-ae0b-f63a594d4fab-scripts\") pod \"ovn-northd-0\" (UID: 
\"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085257 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085278 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84hzh\" (UniqueName: \"kubernetes.io/projected/1712e112-23fd-402b-ae0b-f63a594d4fab-kube-api-access-84hzh\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085306 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085332 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-config\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085357 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " 
pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085407 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085424 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.088958 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1712e112-23fd-402b-ae0b-f63a594d4fab-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.089458 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1712e112-23fd-402b-ae0b-f63a594d4fab-config\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.089599 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1712e112-23fd-402b-ae0b-f63a594d4fab-scripts\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.094288 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.107558 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.116450 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.123073 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84hzh\" (UniqueName: \"kubernetes.io/projected/1712e112-23fd-402b-ae0b-f63a594d4fab-kube-api-access-84hzh\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.187658 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.187756 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-config\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " 
pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.187793 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.187845 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.187882 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m768\" (UniqueName: \"kubernetes.io/projected/b3682585-f554-4a65-86cb-096243ccc793-kube-api-access-8m768\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.190107 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.190786 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-config\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: 
I0301 09:27:00.191605 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.193636 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.202496 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.214299 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.214474 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" event={"ID":"f8a60915-b57b-4331-ad0e-b671ff576a69","Type":"ContainerDied","Data":"67be12a989d9394e207bd8944b8cd6294460a40d572fd7dffd8e27ec9d363809"} Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.230214 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m768\" (UniqueName: \"kubernetes.io/projected/b3682585-f554-4a65-86cb-096243ccc793-kube-api-access-8m768\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.290463 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-config\") pod \"f8a60915-b57b-4331-ad0e-b671ff576a69\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.290543 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws769\" (UniqueName: \"kubernetes.io/projected/f8a60915-b57b-4331-ad0e-b671ff576a69-kube-api-access-ws769\") pod \"f8a60915-b57b-4331-ad0e-b671ff576a69\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.290632 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-dns-svc\") pod \"f8a60915-b57b-4331-ad0e-b671ff576a69\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.292519 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-config" (OuterVolumeSpecName: "config") pod "f8a60915-b57b-4331-ad0e-b671ff576a69" (UID: "f8a60915-b57b-4331-ad0e-b671ff576a69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.294037 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f8a60915-b57b-4331-ad0e-b671ff576a69" (UID: "f8a60915-b57b-4331-ad0e-b671ff576a69"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.297347 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8a60915-b57b-4331-ad0e-b671ff576a69-kube-api-access-ws769" (OuterVolumeSpecName: "kube-api-access-ws769") pod "f8a60915-b57b-4331-ad0e-b671ff576a69" (UID: "f8a60915-b57b-4331-ad0e-b671ff576a69"). InnerVolumeSpecName "kube-api-access-ws769". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.358728 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.361253 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.387307 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.392315 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.392354 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws769\" (UniqueName: \"kubernetes.io/projected/f8a60915-b57b-4331-ad0e-b671ff576a69-kube-api-access-ws769\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.392371 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.495699 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-config\") pod \"1ebfee14-6440-476a-87ff-fc933df3eaa8\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.495763 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fwv9\" (UniqueName: \"kubernetes.io/projected/1ebfee14-6440-476a-87ff-fc933df3eaa8-kube-api-access-6fwv9\") pod \"1ebfee14-6440-476a-87ff-fc933df3eaa8\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.495895 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-dns-svc\") pod \"1ebfee14-6440-476a-87ff-fc933df3eaa8\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.497744 4792 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-config" (OuterVolumeSpecName: "config") pod "1ebfee14-6440-476a-87ff-fc933df3eaa8" (UID: "1ebfee14-6440-476a-87ff-fc933df3eaa8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.501267 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1ebfee14-6440-476a-87ff-fc933df3eaa8" (UID: "1ebfee14-6440-476a-87ff-fc933df3eaa8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.501471 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ebfee14-6440-476a-87ff-fc933df3eaa8-kube-api-access-6fwv9" (OuterVolumeSpecName: "kube-api-access-6fwv9") pod "1ebfee14-6440-476a-87ff-fc933df3eaa8" (UID: "1ebfee14-6440-476a-87ff-fc933df3eaa8"). InnerVolumeSpecName "kube-api-access-6fwv9". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.597839 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-config\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.597888 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fwv9\" (UniqueName: \"kubernetes.io/projected/1ebfee14-6440-476a-87ff-fc933df3eaa8-kube-api-access-6fwv9\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.597969 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.626313 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-czwcl"]
Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.636687 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-czwcl"]
Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.662932 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-j6c8g"]
Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.797856 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7wc55"]
Mar 01 09:27:00 crc kubenswrapper[4792]: W0301 09:27:00.805604 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9493aff0_58e3_44ca_ba01_69f3b284d732.slice/crio-a74e88581a327876e95868d41b270959ebb242e2f3a2355c320482a146ea207d WatchSource:0}: Error finding container a74e88581a327876e95868d41b270959ebb242e2f3a2355c320482a146ea207d: Status 404 returned error can't find the container with id a74e88581a327876e95868d41b270959ebb242e2f3a2355c320482a146ea207d
Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.940061 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 01 09:27:00 crc kubenswrapper[4792]: W0301 09:27:00.963513 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1712e112_23fd_402b_ae0b_f63a594d4fab.slice/crio-ba7f2650ceeaba5235acd15be0999a93cf7db00d9f318c1eb3a0167a08dc809d WatchSource:0}: Error finding container ba7f2650ceeaba5235acd15be0999a93cf7db00d9f318c1eb3a0167a08dc809d: Status 404 returned error can't find the container with id ba7f2650ceeaba5235acd15be0999a93cf7db00d9f318c1eb3a0167a08dc809d
Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.043269 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-mz86z"]
Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.220099 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1712e112-23fd-402b-ae0b-f63a594d4fab","Type":"ContainerStarted","Data":"ba7f2650ceeaba5235acd15be0999a93cf7db00d9f318c1eb3a0167a08dc809d"}
Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.222305 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" event={"ID":"1ebfee14-6440-476a-87ff-fc933df3eaa8","Type":"ContainerDied","Data":"dfb18f2a46a9a4bd99a6d3ee488952b80257c666cf6b574e91f2c62a39fd84f6"}
Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.222458 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"
Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.229195 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" event={"ID":"b3682585-f554-4a65-86cb-096243ccc793","Type":"ContainerStarted","Data":"b2e8c06688804bde6de3674de22e963e00a7db65a6fb4924d8a98f95171a76cc"}
Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.232719 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" event={"ID":"87678b56-0909-4735-ad6b-cb992dc86853","Type":"ContainerStarted","Data":"0a697580492b727fa9bf9603c48fb17f94e586d5d0286904452488d9188bd582"}
Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.234280 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7wc55" event={"ID":"9493aff0-58e3-44ca-ba01-69f3b284d732","Type":"ContainerStarted","Data":"60455b6869ef76ca128b15b1a396f701387e37d930aa24eb9126981d2885fa5b"}
Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.234380 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7wc55" event={"ID":"9493aff0-58e3-44ca-ba01-69f3b284d732","Type":"ContainerStarted","Data":"a74e88581a327876e95868d41b270959ebb242e2f3a2355c320482a146ea207d"}
Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.264684 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-7wc55" podStartSLOduration=2.264664422 podStartE2EDuration="2.264664422s" podCreationTimestamp="2026-03-01 09:26:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:01.261369754 +0000 UTC m=+1150.503248951" watchObservedRunningTime="2026-03-01 09:27:01.264664422 +0000 UTC m=+1150.506543619"
Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.323270 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"]
Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.331475 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"]
Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.423846 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ebfee14-6440-476a-87ff-fc933df3eaa8" path="/var/lib/kubelet/pods/1ebfee14-6440-476a-87ff-fc933df3eaa8/volumes"
Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.424205 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8a60915-b57b-4331-ad0e-b671ff576a69" path="/var/lib/kubelet/pods/f8a60915-b57b-4331-ad0e-b671ff576a69/volumes"
Mar 01 09:27:02 crc kubenswrapper[4792]: I0301 09:27:02.263233 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1712e112-23fd-402b-ae0b-f63a594d4fab","Type":"ContainerStarted","Data":"b137b70528617dca2a15862de2f905a20234a875aa1eecd21babd85b68e4a3ee"}
Mar 01 09:27:02 crc kubenswrapper[4792]: I0301 09:27:02.265229 4792 generic.go:334] "Generic (PLEG): container finished" podID="b3682585-f554-4a65-86cb-096243ccc793" containerID="0ca19db3fcc227c24e95850e002b21aaf1788482cc19ab24284a2d399a8eb0fd" exitCode=0
Mar 01 09:27:02 crc kubenswrapper[4792]: I0301 09:27:02.265283 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" event={"ID":"b3682585-f554-4a65-86cb-096243ccc793","Type":"ContainerDied","Data":"0ca19db3fcc227c24e95850e002b21aaf1788482cc19ab24284a2d399a8eb0fd"}
Mar 01 09:27:02 crc kubenswrapper[4792]: I0301 09:27:02.278281 4792 generic.go:334] "Generic (PLEG): container finished" podID="87678b56-0909-4735-ad6b-cb992dc86853" containerID="dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7" exitCode=0
Mar 01 09:27:02 crc kubenswrapper[4792]: I0301 09:27:02.278356 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" event={"ID":"87678b56-0909-4735-ad6b-cb992dc86853","Type":"ContainerDied","Data":"dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7"}
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.085284 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-j6lbg"]
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.086497 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j6lbg"
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.088752 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.103704 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j6lbg"]
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.242263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn2gw\" (UniqueName: \"kubernetes.io/projected/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-kube-api-access-tn2gw\") pod \"root-account-create-update-j6lbg\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") " pod="openstack/root-account-create-update-j6lbg"
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.242326 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-operator-scripts\") pod \"root-account-create-update-j6lbg\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") " pod="openstack/root-account-create-update-j6lbg"
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.286878 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" event={"ID":"b3682585-f554-4a65-86cb-096243ccc793","Type":"ContainerStarted","Data":"6049c60340fa24dee1fdddb897ca32ccd559a6ca6bdacf1ad0f33b624dbf4865"}
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.286977 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z"
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.290133 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" event={"ID":"87678b56-0909-4735-ad6b-cb992dc86853","Type":"ContainerStarted","Data":"145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894"}
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.290300 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g"
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.291921 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1712e112-23fd-402b-ae0b-f63a594d4fab","Type":"ContainerStarted","Data":"23a988a54e4c4461bbed778852e583fffba07392434e115c274bc6841e8ab6bb"}
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.292041 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.307630 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" podStartSLOduration=4.307614137 podStartE2EDuration="4.307614137s" podCreationTimestamp="2026-03-01 09:26:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:03.303203942 +0000 UTC m=+1152.545083139" watchObservedRunningTime="2026-03-01 09:27:03.307614137 +0000 UTC m=+1152.549493334"
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.332281 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" podStartSLOduration=3.916259031 podStartE2EDuration="4.332259265s" podCreationTimestamp="2026-03-01 09:26:59 +0000 UTC" firstStartedPulling="2026-03-01 09:27:00.669807646 +0000 UTC m=+1149.911686843" lastFinishedPulling="2026-03-01 09:27:01.08580788 +0000 UTC m=+1150.327687077" observedRunningTime="2026-03-01 09:27:03.32367454 +0000 UTC m=+1152.565553777" watchObservedRunningTime="2026-03-01 09:27:03.332259265 +0000 UTC m=+1152.574138472"
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.343806 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn2gw\" (UniqueName: \"kubernetes.io/projected/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-kube-api-access-tn2gw\") pod \"root-account-create-update-j6lbg\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") " pod="openstack/root-account-create-update-j6lbg"
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.344078 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-operator-scripts\") pod \"root-account-create-update-j6lbg\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") " pod="openstack/root-account-create-update-j6lbg"
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.345285 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-operator-scripts\") pod \"root-account-create-update-j6lbg\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") " pod="openstack/root-account-create-update-j6lbg"
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.351940 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.330566242 podStartE2EDuration="4.351922113s" podCreationTimestamp="2026-03-01 09:26:59 +0000 UTC" firstStartedPulling="2026-03-01 09:27:00.966075166 +0000 UTC m=+1150.207954363" lastFinishedPulling="2026-03-01 09:27:01.987431037 +0000 UTC m=+1151.229310234" observedRunningTime="2026-03-01 09:27:03.342803696 +0000 UTC m=+1152.584682923" watchObservedRunningTime="2026-03-01 09:27:03.351922113 +0000 UTC m=+1152.593801310"
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.373999 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn2gw\" (UniqueName: \"kubernetes.io/projected/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-kube-api-access-tn2gw\") pod \"root-account-create-update-j6lbg\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") " pod="openstack/root-account-create-update-j6lbg"
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.405975 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j6lbg"
Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.829783 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j6lbg"]
Mar 01 09:27:04 crc kubenswrapper[4792]: I0301 09:27:04.298856 4792 generic.go:334] "Generic (PLEG): container finished" podID="77225d3a-9f2d-4aaf-8b98-6dc5310db3da" containerID="6dd143b9e9badd592279cca432fe539c49e92a79ca469f608516d1e967d18c73" exitCode=0
Mar 01 09:27:04 crc kubenswrapper[4792]: I0301 09:27:04.298935 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j6lbg" event={"ID":"77225d3a-9f2d-4aaf-8b98-6dc5310db3da","Type":"ContainerDied","Data":"6dd143b9e9badd592279cca432fe539c49e92a79ca469f608516d1e967d18c73"}
Mar 01 09:27:04 crc kubenswrapper[4792]: I0301 09:27:04.298965 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j6lbg" event={"ID":"77225d3a-9f2d-4aaf-8b98-6dc5310db3da","Type":"ContainerStarted","Data":"955029b7b4fc653d1ccc05aa7f77cb849f104e2da77ccb3a30ce4c53323eba36"}
Mar 01 09:27:04 crc kubenswrapper[4792]: I0301 09:27:04.943224 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:27:04 crc kubenswrapper[4792]: I0301 09:27:04.943503 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:27:04 crc kubenswrapper[4792]: I0301 09:27:04.943535 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv"
Mar 01 09:27:04 crc kubenswrapper[4792]: I0301 09:27:04.944211 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9fc8b9702d9d3591695478729a2a209996e2f83219ba8649a31afc02f286ad3f"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 01 09:27:04 crc kubenswrapper[4792]: I0301 09:27:04.944270 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://9fc8b9702d9d3591695478729a2a209996e2f83219ba8649a31afc02f286ad3f" gracePeriod=600
Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.308398 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="9fc8b9702d9d3591695478729a2a209996e2f83219ba8649a31afc02f286ad3f" exitCode=0
Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.308591 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"9fc8b9702d9d3591695478729a2a209996e2f83219ba8649a31afc02f286ad3f"}
Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.308622 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"d3c51b46f8c635f0bd922e6a816c6cbadeb855fe42f4d60474cde44514e4e4de"}
Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.308637 4792 scope.go:117] "RemoveContainer" containerID="f223d76bf80d673696c61fa09c13cc363c202a24d360c9e2da1e52c335e05521"
Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.654668 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j6lbg"
Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.719583 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.719643 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.788799 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn2gw\" (UniqueName: \"kubernetes.io/projected/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-kube-api-access-tn2gw\") pod \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") "
Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.788961 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-operator-scripts\") pod \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") "
Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.789784 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77225d3a-9f2d-4aaf-8b98-6dc5310db3da" (UID: "77225d3a-9f2d-4aaf-8b98-6dc5310db3da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.793430 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.800153 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-kube-api-access-tn2gw" (OuterVolumeSpecName: "kube-api-access-tn2gw") pod "77225d3a-9f2d-4aaf-8b98-6dc5310db3da" (UID: "77225d3a-9f2d-4aaf-8b98-6dc5310db3da"). InnerVolumeSpecName "kube-api-access-tn2gw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.891276 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn2gw\" (UniqueName: \"kubernetes.io/projected/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-kube-api-access-tn2gw\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.891625 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.191368 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-dlv4c"]
Mar 01 09:27:06 crc kubenswrapper[4792]: E0301 09:27:06.191927 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77225d3a-9f2d-4aaf-8b98-6dc5310db3da" containerName="mariadb-account-create-update"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.192016 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="77225d3a-9f2d-4aaf-8b98-6dc5310db3da" containerName="mariadb-account-create-update"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.192224 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="77225d3a-9f2d-4aaf-8b98-6dc5310db3da" containerName="mariadb-account-create-update"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.192745 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dlv4c"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.206315 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dlv4c"]
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.285880 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2c3a-account-create-update-vnvfb"]
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.287108 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2c3a-account-create-update-vnvfb"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.291279 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.296596 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-operator-scripts\") pod \"glance-db-create-dlv4c\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " pod="openstack/glance-db-create-dlv4c"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.296687 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2pr2\" (UniqueName: \"kubernetes.io/projected/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-kube-api-access-v2pr2\") pod \"glance-db-create-dlv4c\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " pod="openstack/glance-db-create-dlv4c"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.299588 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2c3a-account-create-update-vnvfb"]
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.326363 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j6lbg"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.326359 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j6lbg" event={"ID":"77225d3a-9f2d-4aaf-8b98-6dc5310db3da","Type":"ContainerDied","Data":"955029b7b4fc653d1ccc05aa7f77cb849f104e2da77ccb3a30ce4c53323eba36"}
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.327535 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="955029b7b4fc653d1ccc05aa7f77cb849f104e2da77ccb3a30ce4c53323eba36"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.397560 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2pr2\" (UniqueName: \"kubernetes.io/projected/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-kube-api-access-v2pr2\") pod \"glance-db-create-dlv4c\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " pod="openstack/glance-db-create-dlv4c"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.397646 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869a99e5-f399-4938-ba59-bbe20e23385b-operator-scripts\") pod \"glance-2c3a-account-create-update-vnvfb\" (UID: \"869a99e5-f399-4938-ba59-bbe20e23385b\") " pod="openstack/glance-2c3a-account-create-update-vnvfb"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.397680 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db95w\" (UniqueName: \"kubernetes.io/projected/869a99e5-f399-4938-ba59-bbe20e23385b-kube-api-access-db95w\") pod \"glance-2c3a-account-create-update-vnvfb\" (UID: \"869a99e5-f399-4938-ba59-bbe20e23385b\") " pod="openstack/glance-2c3a-account-create-update-vnvfb"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.397768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-operator-scripts\") pod \"glance-db-create-dlv4c\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " pod="openstack/glance-db-create-dlv4c"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.399353 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-operator-scripts\") pod \"glance-db-create-dlv4c\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " pod="openstack/glance-db-create-dlv4c"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.423499 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.424117 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2pr2\" (UniqueName: \"kubernetes.io/projected/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-kube-api-access-v2pr2\") pod \"glance-db-create-dlv4c\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " pod="openstack/glance-db-create-dlv4c"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.499112 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869a99e5-f399-4938-ba59-bbe20e23385b-operator-scripts\") pod \"glance-2c3a-account-create-update-vnvfb\" (UID: \"869a99e5-f399-4938-ba59-bbe20e23385b\") " pod="openstack/glance-2c3a-account-create-update-vnvfb"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.499820 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869a99e5-f399-4938-ba59-bbe20e23385b-operator-scripts\") pod \"glance-2c3a-account-create-update-vnvfb\" (UID: \"869a99e5-f399-4938-ba59-bbe20e23385b\") " pod="openstack/glance-2c3a-account-create-update-vnvfb"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.499879 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db95w\" (UniqueName: \"kubernetes.io/projected/869a99e5-f399-4938-ba59-bbe20e23385b-kube-api-access-db95w\") pod \"glance-2c3a-account-create-update-vnvfb\" (UID: \"869a99e5-f399-4938-ba59-bbe20e23385b\") " pod="openstack/glance-2c3a-account-create-update-vnvfb"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.511321 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dlv4c"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.526456 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db95w\" (UniqueName: \"kubernetes.io/projected/869a99e5-f399-4938-ba59-bbe20e23385b-kube-api-access-db95w\") pod \"glance-2c3a-account-create-update-vnvfb\" (UID: \"869a99e5-f399-4938-ba59-bbe20e23385b\") " pod="openstack/glance-2c3a-account-create-update-vnvfb"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.619852 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2c3a-account-create-update-vnvfb"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.967493 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-f95nh"]
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.968648 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f95nh"
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.983232 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-f95nh"]
Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.991276 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dlv4c"]
Mar 01 09:27:06 crc kubenswrapper[4792]: W0301 09:27:06.999028 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod192b539c_c4b9_4c4e_93e3_23b6dc0d7ec5.slice/crio-09c567031c462356ea2900b9a1ef75321e0cbd49a89efe5f19a9b5a3e329bfaf WatchSource:0}: Error finding container 09c567031c462356ea2900b9a1ef75321e0cbd49a89efe5f19a9b5a3e329bfaf: Status 404 returned error can't find the container with id 09c567031c462356ea2900b9a1ef75321e0cbd49a89efe5f19a9b5a3e329bfaf
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.077278 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-11c1-account-create-update-8h9xf"]
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.078168 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-11c1-account-create-update-8h9xf"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.082891 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.091621 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-11c1-account-create-update-8h9xf"]
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.116460 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqd46\" (UniqueName: \"kubernetes.io/projected/127158ae-b49c-42bd-932d-af85eafce8c0-kube-api-access-vqd46\") pod \"keystone-db-create-f95nh\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " pod="openstack/keystone-db-create-f95nh"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.116509 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/127158ae-b49c-42bd-932d-af85eafce8c0-operator-scripts\") pod \"keystone-db-create-f95nh\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " pod="openstack/keystone-db-create-f95nh"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.191643 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-8zsss"]
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.193326 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8zsss"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.217845 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/127158ae-b49c-42bd-932d-af85eafce8c0-operator-scripts\") pod \"keystone-db-create-f95nh\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " pod="openstack/keystone-db-create-f95nh"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.218057 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj9z6\" (UniqueName: \"kubernetes.io/projected/46d8b4e1-c1b5-468c-b319-84985c525d6a-kube-api-access-zj9z6\") pod \"keystone-11c1-account-create-update-8h9xf\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " pod="openstack/keystone-11c1-account-create-update-8h9xf"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.218106 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d8b4e1-c1b5-468c-b319-84985c525d6a-operator-scripts\") pod \"keystone-11c1-account-create-update-8h9xf\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " pod="openstack/keystone-11c1-account-create-update-8h9xf"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.218125 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqd46\" (UniqueName: \"kubernetes.io/projected/127158ae-b49c-42bd-932d-af85eafce8c0-kube-api-access-vqd46\") pod \"keystone-db-create-f95nh\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " pod="openstack/keystone-db-create-f95nh"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.218707 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8zsss"]
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.219274 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/127158ae-b49c-42bd-932d-af85eafce8c0-operator-scripts\") pod \"keystone-db-create-f95nh\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " pod="openstack/keystone-db-create-f95nh"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.246663 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqd46\" (UniqueName: \"kubernetes.io/projected/127158ae-b49c-42bd-932d-af85eafce8c0-kube-api-access-vqd46\") pod \"keystone-db-create-f95nh\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " pod="openstack/keystone-db-create-f95nh"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.249536 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2c3a-account-create-update-vnvfb"]
Mar 01 09:27:07 crc kubenswrapper[4792]: W0301 09:27:07.255072 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod869a99e5_f399_4938_ba59_bbe20e23385b.slice/crio-2a94915c9a7958c2ca19c1867a8b54e166c8909a2e2ab8660c53f3728dff551a WatchSource:0}: Error finding container 2a94915c9a7958c2ca19c1867a8b54e166c8909a2e2ab8660c53f3728dff551a: Status 404 returned error can't find the container with id 2a94915c9a7958c2ca19c1867a8b54e166c8909a2e2ab8660c53f3728dff551a
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.275444 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d8c5-account-create-update-z4zgs"]
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.277525 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d8c5-account-create-update-z4zgs"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.288465 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.296403 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d8c5-account-create-update-z4zgs"]
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.296696 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f95nh"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.320047 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxrtr\" (UniqueName: \"kubernetes.io/projected/272107df-b15b-4c97-b9b0-e865f9a391da-kube-api-access-rxrtr\") pod \"placement-db-create-8zsss\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " pod="openstack/placement-db-create-8zsss"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.320142 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj9z6\" (UniqueName: \"kubernetes.io/projected/46d8b4e1-c1b5-468c-b319-84985c525d6a-kube-api-access-zj9z6\") pod \"keystone-11c1-account-create-update-8h9xf\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " pod="openstack/keystone-11c1-account-create-update-8h9xf"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.320202 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/272107df-b15b-4c97-b9b0-e865f9a391da-operator-scripts\") pod \"placement-db-create-8zsss\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " pod="openstack/placement-db-create-8zsss"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.320262 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d8b4e1-c1b5-468c-b319-84985c525d6a-operator-scripts\") pod \"keystone-11c1-account-create-update-8h9xf\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " pod="openstack/keystone-11c1-account-create-update-8h9xf"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.321159 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d8b4e1-c1b5-468c-b319-84985c525d6a-operator-scripts\") pod \"keystone-11c1-account-create-update-8h9xf\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " pod="openstack/keystone-11c1-account-create-update-8h9xf"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.336459 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj9z6\" (UniqueName: \"kubernetes.io/projected/46d8b4e1-c1b5-468c-b319-84985c525d6a-kube-api-access-zj9z6\") pod \"keystone-11c1-account-create-update-8h9xf\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " pod="openstack/keystone-11c1-account-create-update-8h9xf"
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.345147 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dlv4c" event={"ID":"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5","Type":"ContainerStarted","Data":"09c567031c462356ea2900b9a1ef75321e0cbd49a89efe5f19a9b5a3e329bfaf"}
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.346842 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2c3a-account-create-update-vnvfb" event={"ID":"869a99e5-f399-4938-ba59-bbe20e23385b","Type":"ContainerStarted","Data":"2a94915c9a7958c2ca19c1867a8b54e166c8909a2e2ab8660c53f3728dff551a"}
Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.390659 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-11c1-account-create-update-8h9xf" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.429779 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-operator-scripts\") pod \"placement-d8c5-account-create-update-z4zgs\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.429852 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxrtr\" (UniqueName: \"kubernetes.io/projected/272107df-b15b-4c97-b9b0-e865f9a391da-kube-api-access-rxrtr\") pod \"placement-db-create-8zsss\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " pod="openstack/placement-db-create-8zsss" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.429941 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/272107df-b15b-4c97-b9b0-e865f9a391da-operator-scripts\") pod \"placement-db-create-8zsss\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " pod="openstack/placement-db-create-8zsss" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.429987 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9s2d\" (UniqueName: \"kubernetes.io/projected/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-kube-api-access-j9s2d\") pod \"placement-d8c5-account-create-update-z4zgs\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.430938 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/272107df-b15b-4c97-b9b0-e865f9a391da-operator-scripts\") pod \"placement-db-create-8zsss\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " pod="openstack/placement-db-create-8zsss" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.449721 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxrtr\" (UniqueName: \"kubernetes.io/projected/272107df-b15b-4c97-b9b0-e865f9a391da-kube-api-access-rxrtr\") pod \"placement-db-create-8zsss\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " pod="openstack/placement-db-create-8zsss" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.510654 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8zsss" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.531234 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9s2d\" (UniqueName: \"kubernetes.io/projected/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-kube-api-access-j9s2d\") pod \"placement-d8c5-account-create-update-z4zgs\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.532401 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-operator-scripts\") pod \"placement-d8c5-account-create-update-z4zgs\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.533573 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-operator-scripts\") pod \"placement-d8c5-account-create-update-z4zgs\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " 
pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.566537 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9s2d\" (UniqueName: \"kubernetes.io/projected/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-kube-api-access-j9s2d\") pod \"placement-d8c5-account-create-update-z4zgs\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.611225 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.871443 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-f95nh"] Mar 01 09:27:07 crc kubenswrapper[4792]: W0301 09:27:07.875155 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod127158ae_b49c_42bd_932d_af85eafce8c0.slice/crio-9a99c4b2686168f24fce829e1a3e70dd78fb16bebec4c0aa6697c70dee318f7b WatchSource:0}: Error finding container 9a99c4b2686168f24fce829e1a3e70dd78fb16bebec4c0aa6697c70dee318f7b: Status 404 returned error can't find the container with id 9a99c4b2686168f24fce829e1a3e70dd78fb16bebec4c0aa6697c70dee318f7b Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.946363 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-11c1-account-create-update-8h9xf"] Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.055670 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8zsss"] Mar 01 09:27:08 crc kubenswrapper[4792]: W0301 09:27:08.072102 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272107df_b15b_4c97_b9b0_e865f9a391da.slice/crio-63bb42f2da2fd3c9a15586710b3a6b5b1ba3e77ac05b9db5ef3689f411e43763 WatchSource:0}: Error finding container 63bb42f2da2fd3c9a15586710b3a6b5b1ba3e77ac05b9db5ef3689f411e43763: Status 404 returned error can't find the container with id 63bb42f2da2fd3c9a15586710b3a6b5b1ba3e77ac05b9db5ef3689f411e43763 Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.186460 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d8c5-account-create-update-z4zgs"] Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.356242 4792 generic.go:334] "Generic (PLEG): container finished" podID="869a99e5-f399-4938-ba59-bbe20e23385b" containerID="5c3a4231cfc20731f9ac2774fb470c532f5db1e9d44253c60e2e47577fa458dc" exitCode=0 Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.356308 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2c3a-account-create-update-vnvfb" event={"ID":"869a99e5-f399-4938-ba59-bbe20e23385b","Type":"ContainerDied","Data":"5c3a4231cfc20731f9ac2774fb470c532f5db1e9d44253c60e2e47577fa458dc"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.357536 4792 generic.go:334] "Generic (PLEG): container finished" podID="127158ae-b49c-42bd-932d-af85eafce8c0" containerID="b99cdab13c59b3d72ed63dfb54dc704e52617818eb25d09b6ad0f435b22c114f" exitCode=0 Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.357607 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f95nh" event={"ID":"127158ae-b49c-42bd-932d-af85eafce8c0","Type":"ContainerDied","Data":"b99cdab13c59b3d72ed63dfb54dc704e52617818eb25d09b6ad0f435b22c114f"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.357633 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f95nh" 
event={"ID":"127158ae-b49c-42bd-932d-af85eafce8c0","Type":"ContainerStarted","Data":"9a99c4b2686168f24fce829e1a3e70dd78fb16bebec4c0aa6697c70dee318f7b"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.359780 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d8c5-account-create-update-z4zgs" event={"ID":"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779","Type":"ContainerStarted","Data":"4bfc939eefab66aa8b4620743c13ae7fca651fc6f905e4eef02dd74188eabdb0"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.361001 4792 generic.go:334] "Generic (PLEG): container finished" podID="192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5" containerID="0efefdd3dac5f3f586a3c6d6e7f2ba1305e9a8e8544b4e285a4ab7c3e12e8018" exitCode=0 Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.361055 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dlv4c" event={"ID":"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5","Type":"ContainerDied","Data":"0efefdd3dac5f3f586a3c6d6e7f2ba1305e9a8e8544b4e285a4ab7c3e12e8018"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.362043 4792 generic.go:334] "Generic (PLEG): container finished" podID="46d8b4e1-c1b5-468c-b319-84985c525d6a" containerID="79a53da4edce2f856b264b84a40ae3b0fe791d8730afd70b1bb1b19a59aff3f9" exitCode=0 Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.362088 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-11c1-account-create-update-8h9xf" event={"ID":"46d8b4e1-c1b5-468c-b319-84985c525d6a","Type":"ContainerDied","Data":"79a53da4edce2f856b264b84a40ae3b0fe791d8730afd70b1bb1b19a59aff3f9"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.362103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-11c1-account-create-update-8h9xf" event={"ID":"46d8b4e1-c1b5-468c-b319-84985c525d6a","Type":"ContainerStarted","Data":"013b3e77f344d3b66e5bc6a232922ce5e3b73c8cdd4b83b86d6000024b74e7d3"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 
09:27:08.364100 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8zsss" event={"ID":"272107df-b15b-4c97-b9b0-e865f9a391da","Type":"ContainerStarted","Data":"98147a64ef321c5a2be94da9872cf3444a2d6ee5365cb23fe0e8d40238b6ab98"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.364124 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8zsss" event={"ID":"272107df-b15b-4c97-b9b0-e865f9a391da","Type":"ContainerStarted","Data":"63bb42f2da2fd3c9a15586710b3a6b5b1ba3e77ac05b9db5ef3689f411e43763"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.407946 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-8zsss" podStartSLOduration=1.407929183 podStartE2EDuration="1.407929183s" podCreationTimestamp="2026-03-01 09:27:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:08.406180392 +0000 UTC m=+1157.648059589" watchObservedRunningTime="2026-03-01 09:27:08.407929183 +0000 UTC m=+1157.649808380" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.371447 4792 generic.go:334] "Generic (PLEG): container finished" podID="272107df-b15b-4c97-b9b0-e865f9a391da" containerID="98147a64ef321c5a2be94da9872cf3444a2d6ee5365cb23fe0e8d40238b6ab98" exitCode=0 Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.371699 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8zsss" event={"ID":"272107df-b15b-4c97-b9b0-e865f9a391da","Type":"ContainerDied","Data":"98147a64ef321c5a2be94da9872cf3444a2d6ee5365cb23fe0e8d40238b6ab98"} Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.373792 4792 generic.go:334] "Generic (PLEG): container finished" podID="58e6fcdd-44b2-4c03-9cf6-a772bd0c3779" containerID="3b9a5bf9216213ab73f7db6aa95b33bd1c546b1770c33a00558994664a8fc4ce" exitCode=0 Mar 01 09:27:09 crc 
kubenswrapper[4792]: I0301 09:27:09.373891 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d8c5-account-create-update-z4zgs" event={"ID":"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779","Type":"ContainerDied","Data":"3b9a5bf9216213ab73f7db6aa95b33bd1c546b1770c33a00558994664a8fc4ce"} Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.692734 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f95nh" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.799951 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2c3a-account-create-update-vnvfb" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.875183 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqd46\" (UniqueName: \"kubernetes.io/projected/127158ae-b49c-42bd-932d-af85eafce8c0-kube-api-access-vqd46\") pod \"127158ae-b49c-42bd-932d-af85eafce8c0\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.875220 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/127158ae-b49c-42bd-932d-af85eafce8c0-operator-scripts\") pod \"127158ae-b49c-42bd-932d-af85eafce8c0\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.876516 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/127158ae-b49c-42bd-932d-af85eafce8c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "127158ae-b49c-42bd-932d-af85eafce8c0" (UID: "127158ae-b49c-42bd-932d-af85eafce8c0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.882292 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/127158ae-b49c-42bd-932d-af85eafce8c0-kube-api-access-vqd46" (OuterVolumeSpecName: "kube-api-access-vqd46") pod "127158ae-b49c-42bd-932d-af85eafce8c0" (UID: "127158ae-b49c-42bd-932d-af85eafce8c0"). InnerVolumeSpecName "kube-api-access-vqd46". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.918596 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dlv4c" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.927121 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-11c1-account-create-update-8h9xf" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.959590 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.976588 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869a99e5-f399-4938-ba59-bbe20e23385b-operator-scripts\") pod \"869a99e5-f399-4938-ba59-bbe20e23385b\" (UID: \"869a99e5-f399-4938-ba59-bbe20e23385b\") " Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.976642 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db95w\" (UniqueName: \"kubernetes.io/projected/869a99e5-f399-4938-ba59-bbe20e23385b-kube-api-access-db95w\") pod \"869a99e5-f399-4938-ba59-bbe20e23385b\" (UID: \"869a99e5-f399-4938-ba59-bbe20e23385b\") " Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.977027 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/869a99e5-f399-4938-ba59-bbe20e23385b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "869a99e5-f399-4938-ba59-bbe20e23385b" (UID: "869a99e5-f399-4938-ba59-bbe20e23385b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.977106 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqd46\" (UniqueName: \"kubernetes.io/projected/127158ae-b49c-42bd-932d-af85eafce8c0-kube-api-access-vqd46\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.977125 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/127158ae-b49c-42bd-932d-af85eafce8c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.982779 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869a99e5-f399-4938-ba59-bbe20e23385b-kube-api-access-db95w" (OuterVolumeSpecName: "kube-api-access-db95w") pod "869a99e5-f399-4938-ba59-bbe20e23385b" (UID: "869a99e5-f399-4938-ba59-bbe20e23385b"). InnerVolumeSpecName "kube-api-access-db95w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.078128 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj9z6\" (UniqueName: \"kubernetes.io/projected/46d8b4e1-c1b5-468c-b319-84985c525d6a-kube-api-access-zj9z6\") pod \"46d8b4e1-c1b5-468c-b319-84985c525d6a\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.078233 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d8b4e1-c1b5-468c-b319-84985c525d6a-operator-scripts\") pod \"46d8b4e1-c1b5-468c-b319-84985c525d6a\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.078301 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-operator-scripts\") pod \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.078321 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2pr2\" (UniqueName: \"kubernetes.io/projected/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-kube-api-access-v2pr2\") pod \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.078742 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db95w\" (UniqueName: \"kubernetes.io/projected/869a99e5-f399-4938-ba59-bbe20e23385b-kube-api-access-db95w\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.078759 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/869a99e5-f399-4938-ba59-bbe20e23385b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.079136 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46d8b4e1-c1b5-468c-b319-84985c525d6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46d8b4e1-c1b5-468c-b319-84985c525d6a" (UID: "46d8b4e1-c1b5-468c-b319-84985c525d6a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.079606 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5" (UID: "192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.083387 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-kube-api-access-v2pr2" (OuterVolumeSpecName: "kube-api-access-v2pr2") pod "192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5" (UID: "192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5"). InnerVolumeSpecName "kube-api-access-v2pr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.083400 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d8b4e1-c1b5-468c-b319-84985c525d6a-kube-api-access-zj9z6" (OuterVolumeSpecName: "kube-api-access-zj9z6") pod "46d8b4e1-c1b5-468c-b319-84985c525d6a" (UID: "46d8b4e1-c1b5-468c-b319-84985c525d6a"). InnerVolumeSpecName "kube-api-access-zj9z6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.179948 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.180129 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2pr2\" (UniqueName: \"kubernetes.io/projected/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-kube-api-access-v2pr2\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.180208 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj9z6\" (UniqueName: \"kubernetes.io/projected/46d8b4e1-c1b5-468c-b319-84985c525d6a-kube-api-access-zj9z6\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.180263 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d8b4e1-c1b5-468c-b319-84985c525d6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.381428 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2c3a-account-create-update-vnvfb" event={"ID":"869a99e5-f399-4938-ba59-bbe20e23385b","Type":"ContainerDied","Data":"2a94915c9a7958c2ca19c1867a8b54e166c8909a2e2ab8660c53f3728dff551a"} Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.381750 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a94915c9a7958c2ca19c1867a8b54e166c8909a2e2ab8660c53f3728dff551a" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.381522 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2c3a-account-create-update-vnvfb" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.385370 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f95nh" event={"ID":"127158ae-b49c-42bd-932d-af85eafce8c0","Type":"ContainerDied","Data":"9a99c4b2686168f24fce829e1a3e70dd78fb16bebec4c0aa6697c70dee318f7b"} Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.385458 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a99c4b2686168f24fce829e1a3e70dd78fb16bebec4c0aa6697c70dee318f7b" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.385588 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f95nh" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.391512 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.392435 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dlv4c" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.392483 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dlv4c" event={"ID":"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5","Type":"ContainerDied","Data":"09c567031c462356ea2900b9a1ef75321e0cbd49a89efe5f19a9b5a3e329bfaf"} Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.392521 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09c567031c462356ea2900b9a1ef75321e0cbd49a89efe5f19a9b5a3e329bfaf" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.394258 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-11c1-account-create-update-8h9xf" event={"ID":"46d8b4e1-c1b5-468c-b319-84985c525d6a","Type":"ContainerDied","Data":"013b3e77f344d3b66e5bc6a232922ce5e3b73c8cdd4b83b86d6000024b74e7d3"} Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.394292 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="013b3e77f344d3b66e5bc6a232922ce5e3b73c8cdd4b83b86d6000024b74e7d3" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.394414 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-11c1-account-create-update-8h9xf" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.474692 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-j6c8g"] Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.474914 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" podUID="87678b56-0909-4735-ad6b-cb992dc86853" containerName="dnsmasq-dns" containerID="cri-o://145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894" gracePeriod=10 Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.863009 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.872836 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8zsss" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.993394 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9s2d\" (UniqueName: \"kubernetes.io/projected/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-kube-api-access-j9s2d\") pod \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.993787 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-operator-scripts\") pod \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.993870 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxrtr\" (UniqueName: \"kubernetes.io/projected/272107df-b15b-4c97-b9b0-e865f9a391da-kube-api-access-rxrtr\") pod \"272107df-b15b-4c97-b9b0-e865f9a391da\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.993892 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/272107df-b15b-4c97-b9b0-e865f9a391da-operator-scripts\") pod \"272107df-b15b-4c97-b9b0-e865f9a391da\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.994351 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779" (UID: "58e6fcdd-44b2-4c03-9cf6-a772bd0c3779"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.994541 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/272107df-b15b-4c97-b9b0-e865f9a391da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "272107df-b15b-4c97-b9b0-e865f9a391da" (UID: "272107df-b15b-4c97-b9b0-e865f9a391da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.003138 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/272107df-b15b-4c97-b9b0-e865f9a391da-kube-api-access-rxrtr" (OuterVolumeSpecName: "kube-api-access-rxrtr") pod "272107df-b15b-4c97-b9b0-e865f9a391da" (UID: "272107df-b15b-4c97-b9b0-e865f9a391da"). InnerVolumeSpecName "kube-api-access-rxrtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.003613 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-kube-api-access-j9s2d" (OuterVolumeSpecName: "kube-api-access-j9s2d") pod "58e6fcdd-44b2-4c03-9cf6-a772bd0c3779" (UID: "58e6fcdd-44b2-4c03-9cf6-a772bd0c3779"). InnerVolumeSpecName "kube-api-access-j9s2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.027395 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.095550 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxrtr\" (UniqueName: \"kubernetes.io/projected/272107df-b15b-4c97-b9b0-e865f9a391da-kube-api-access-rxrtr\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.095581 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/272107df-b15b-4c97-b9b0-e865f9a391da-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.095590 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9s2d\" (UniqueName: \"kubernetes.io/projected/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-kube-api-access-j9s2d\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.095599 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.196614 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-config\") pod \"87678b56-0909-4735-ad6b-cb992dc86853\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.196684 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzbw9\" (UniqueName: \"kubernetes.io/projected/87678b56-0909-4735-ad6b-cb992dc86853-kube-api-access-tzbw9\") pod \"87678b56-0909-4735-ad6b-cb992dc86853\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.196780 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-dns-svc\") pod \"87678b56-0909-4735-ad6b-cb992dc86853\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.196839 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-ovsdbserver-nb\") pod \"87678b56-0909-4735-ad6b-cb992dc86853\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.200076 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87678b56-0909-4735-ad6b-cb992dc86853-kube-api-access-tzbw9" (OuterVolumeSpecName: "kube-api-access-tzbw9") pod "87678b56-0909-4735-ad6b-cb992dc86853" (UID: "87678b56-0909-4735-ad6b-cb992dc86853"). InnerVolumeSpecName "kube-api-access-tzbw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.231917 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87678b56-0909-4735-ad6b-cb992dc86853" (UID: "87678b56-0909-4735-ad6b-cb992dc86853"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.247402 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-config" (OuterVolumeSpecName: "config") pod "87678b56-0909-4735-ad6b-cb992dc86853" (UID: "87678b56-0909-4735-ad6b-cb992dc86853"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.248325 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87678b56-0909-4735-ad6b-cb992dc86853" (UID: "87678b56-0909-4735-ad6b-cb992dc86853"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.298617 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.298665 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.298682 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.298699 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzbw9\" (UniqueName: \"kubernetes.io/projected/87678b56-0909-4735-ad6b-cb992dc86853-kube-api-access-tzbw9\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397105 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-ks68h"] Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397400 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397413 4792 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397434 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e6fcdd-44b2-4c03-9cf6-a772bd0c3779" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397439 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e6fcdd-44b2-4c03-9cf6-a772bd0c3779" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397450 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="272107df-b15b-4c97-b9b0-e865f9a391da" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397456 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="272107df-b15b-4c97-b9b0-e865f9a391da" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397465 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87678b56-0909-4735-ad6b-cb992dc86853" containerName="init" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397471 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="87678b56-0909-4735-ad6b-cb992dc86853" containerName="init" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397486 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869a99e5-f399-4938-ba59-bbe20e23385b" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397492 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="869a99e5-f399-4938-ba59-bbe20e23385b" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397501 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87678b56-0909-4735-ad6b-cb992dc86853" containerName="dnsmasq-dns" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 
09:27:11.397506 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="87678b56-0909-4735-ad6b-cb992dc86853" containerName="dnsmasq-dns" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397519 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127158ae-b49c-42bd-932d-af85eafce8c0" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397524 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="127158ae-b49c-42bd-932d-af85eafce8c0" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397538 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d8b4e1-c1b5-468c-b319-84985c525d6a" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397543 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d8b4e1-c1b5-468c-b319-84985c525d6a" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397683 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d8b4e1-c1b5-468c-b319-84985c525d6a" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397693 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="87678b56-0909-4735-ad6b-cb992dc86853" containerName="dnsmasq-dns" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397704 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="272107df-b15b-4c97-b9b0-e865f9a391da" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397716 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="58e6fcdd-44b2-4c03-9cf6-a772bd0c3779" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397725 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5" 
containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397734 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="869a99e5-f399-4938-ba59-bbe20e23385b" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397746 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="127158ae-b49c-42bd-932d-af85eafce8c0" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.398202 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.400581 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.404510 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d8c5-account-create-update-z4zgs" event={"ID":"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779","Type":"ContainerDied","Data":"4bfc939eefab66aa8b4620743c13ae7fca651fc6f905e4eef02dd74188eabdb0"} Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.404542 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.404557 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bfc939eefab66aa8b4620743c13ae7fca651fc6f905e4eef02dd74188eabdb0" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.407326 4792 generic.go:334] "Generic (PLEG): container finished" podID="87678b56-0909-4735-ad6b-cb992dc86853" containerID="145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894" exitCode=0 Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.407372 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" event={"ID":"87678b56-0909-4735-ad6b-cb992dc86853","Type":"ContainerDied","Data":"145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894"} Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.407392 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" event={"ID":"87678b56-0909-4735-ad6b-cb992dc86853","Type":"ContainerDied","Data":"0a697580492b727fa9bf9603c48fb17f94e586d5d0286904452488d9188bd582"} Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.407408 4792 scope.go:117] "RemoveContainer" containerID="145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.407508 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.409070 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vwjrh" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.411305 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8zsss" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.436855 4792 scope.go:117] "RemoveContainer" containerID="dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.473836 4792 scope.go:117] "RemoveContainer" containerID="145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.474409 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894\": container with ID starting with 145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894 not found: ID does not exist" containerID="145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.474448 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894"} err="failed to get container status \"145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894\": rpc error: code = NotFound desc = could not find container \"145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894\": container with ID starting with 145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894 not found: ID does not exist" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.474475 4792 scope.go:117] "RemoveContainer" containerID="dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.475258 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7\": container with ID starting with 
dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7 not found: ID does not exist" containerID="dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.475282 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7"} err="failed to get container status \"dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7\": rpc error: code = NotFound desc = could not find container \"dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7\": container with ID starting with dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7 not found: ID does not exist" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.480497 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ks68h"] Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.480696 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8zsss" event={"ID":"272107df-b15b-4c97-b9b0-e865f9a391da","Type":"ContainerDied","Data":"63bb42f2da2fd3c9a15586710b3a6b5b1ba3e77ac05b9db5ef3689f411e43763"} Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.480726 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63bb42f2da2fd3c9a15586710b3a6b5b1ba3e77ac05b9db5ef3689f411e43763" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.480742 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-j6c8g"] Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.492781 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-j6c8g"] Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.505892 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-config-data\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.506005 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xks9p\" (UniqueName: \"kubernetes.io/projected/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-kube-api-access-xks9p\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.506695 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-db-sync-config-data\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.506756 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-combined-ca-bundle\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.608519 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-config-data\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.608613 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xks9p\" (UniqueName: 
\"kubernetes.io/projected/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-kube-api-access-xks9p\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.608675 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-db-sync-config-data\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.608696 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-combined-ca-bundle\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.613322 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-combined-ca-bundle\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.615961 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-config-data\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.617962 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-db-sync-config-data\") pod \"glance-db-sync-ks68h\" (UID: 
\"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.624460 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xks9p\" (UniqueName: \"kubernetes.io/projected/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-kube-api-access-xks9p\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.757259 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:12 crc kubenswrapper[4792]: I0301 09:27:12.413776 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ks68h"] Mar 01 09:27:13 crc kubenswrapper[4792]: I0301 09:27:13.425760 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87678b56-0909-4735-ad6b-cb992dc86853" path="/var/lib/kubelet/pods/87678b56-0909-4735-ad6b-cb992dc86853/volumes" Mar 01 09:27:13 crc kubenswrapper[4792]: I0301 09:27:13.430507 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ks68h" event={"ID":"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89","Type":"ContainerStarted","Data":"cb43fcc4ccc58a02f1becab11b8e15aea8e8f7775807952c1c371315885d3f5c"} Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.332726 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-j6lbg"] Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.342470 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-j6lbg"] Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.413160 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2dgrc"] Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.414397 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.421992 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.422134 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2dgrc"] Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.573376 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwrqj\" (UniqueName: \"kubernetes.io/projected/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-kube-api-access-bwrqj\") pod \"root-account-create-update-2dgrc\" (UID: \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") " pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.573490 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-operator-scripts\") pod \"root-account-create-update-2dgrc\" (UID: \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") " pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.675477 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-operator-scripts\") pod \"root-account-create-update-2dgrc\" (UID: \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") " pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.675563 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwrqj\" (UniqueName: \"kubernetes.io/projected/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-kube-api-access-bwrqj\") pod \"root-account-create-update-2dgrc\" (UID: 
\"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") " pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.676621 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-operator-scripts\") pod \"root-account-create-update-2dgrc\" (UID: \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") " pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.693815 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwrqj\" (UniqueName: \"kubernetes.io/projected/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-kube-api-access-bwrqj\") pod \"root-account-create-update-2dgrc\" (UID: \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") " pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.736377 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:15 crc kubenswrapper[4792]: I0301 09:27:15.166269 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2dgrc"] Mar 01 09:27:15 crc kubenswrapper[4792]: W0301 09:27:15.188084 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9e80b4e_b68a_48a2_b0fe_e5cf19e00669.slice/crio-4f51b274cb581b24cd99a79ceeae3ca3c753bc062a04d1f27a5ca9e221f2dd2b WatchSource:0}: Error finding container 4f51b274cb581b24cd99a79ceeae3ca3c753bc062a04d1f27a5ca9e221f2dd2b: Status 404 returned error can't find the container with id 4f51b274cb581b24cd99a79ceeae3ca3c753bc062a04d1f27a5ca9e221f2dd2b Mar 01 09:27:15 crc kubenswrapper[4792]: I0301 09:27:15.418798 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77225d3a-9f2d-4aaf-8b98-6dc5310db3da" path="/var/lib/kubelet/pods/77225d3a-9f2d-4aaf-8b98-6dc5310db3da/volumes" Mar 01 09:27:15 crc kubenswrapper[4792]: I0301 09:27:15.467628 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2dgrc" event={"ID":"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669","Type":"ContainerStarted","Data":"85f8d1f8a57c04591a9aaccb1305c025dd215cbe1527e2573cb115d042731951"} Mar 01 09:27:15 crc kubenswrapper[4792]: I0301 09:27:15.467686 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2dgrc" event={"ID":"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669","Type":"ContainerStarted","Data":"4f51b274cb581b24cd99a79ceeae3ca3c753bc062a04d1f27a5ca9e221f2dd2b"} Mar 01 09:27:15 crc kubenswrapper[4792]: I0301 09:27:15.490558 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-2dgrc" podStartSLOduration=1.490542509 podStartE2EDuration="1.490542509s" podCreationTimestamp="2026-03-01 09:27:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:15.484723271 +0000 UTC m=+1164.726602488" watchObservedRunningTime="2026-03-01 09:27:15.490542509 +0000 UTC m=+1164.732421706" Mar 01 09:27:16 crc kubenswrapper[4792]: I0301 09:27:16.475409 4792 generic.go:334] "Generic (PLEG): container finished" podID="e9e80b4e-b68a-48a2-b0fe-e5cf19e00669" containerID="85f8d1f8a57c04591a9aaccb1305c025dd215cbe1527e2573cb115d042731951" exitCode=0 Mar 01 09:27:16 crc kubenswrapper[4792]: I0301 09:27:16.475688 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2dgrc" event={"ID":"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669","Type":"ContainerDied","Data":"85f8d1f8a57c04591a9aaccb1305c025dd215cbe1527e2573cb115d042731951"} Mar 01 09:27:20 crc kubenswrapper[4792]: I0301 09:27:20.431385 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 01 09:27:22 crc kubenswrapper[4792]: I0301 09:27:22.534615 4792 generic.go:334] "Generic (PLEG): container finished" podID="6252a079-917c-46e8-a848-10569e1e057e" containerID="81c1c2615bd05b6e2f8a23b6d892f8335b3c7a5c117575ce3ed245f2faa7543f" exitCode=0 Mar 01 09:27:22 crc kubenswrapper[4792]: I0301 09:27:22.534659 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6252a079-917c-46e8-a848-10569e1e057e","Type":"ContainerDied","Data":"81c1c2615bd05b6e2f8a23b6d892f8335b3c7a5c117575ce3ed245f2faa7543f"} Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.546302 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2dgrc" event={"ID":"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669","Type":"ContainerDied","Data":"4f51b274cb581b24cd99a79ceeae3ca3c753bc062a04d1f27a5ca9e221f2dd2b"} Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.547639 4792 pod_container_deletor.go:80] "Container not found 
in pod's containers" containerID="4f51b274cb581b24cd99a79ceeae3ca3c753bc062a04d1f27a5ca9e221f2dd2b"
Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.549464 4792 generic.go:334] "Generic (PLEG): container finished" podID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerID="6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b" exitCode=0
Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.549514 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24","Type":"ContainerDied","Data":"6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b"}
Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.569266 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2dgrc"
Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.743009 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwrqj\" (UniqueName: \"kubernetes.io/projected/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-kube-api-access-bwrqj\") pod \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\" (UID: \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") "
Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.743168 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-operator-scripts\") pod \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\" (UID: \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") "
Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.744764 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9e80b4e-b68a-48a2-b0fe-e5cf19e00669" (UID: "e9e80b4e-b68a-48a2-b0fe-e5cf19e00669"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.748681 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-kube-api-access-bwrqj" (OuterVolumeSpecName: "kube-api-access-bwrqj") pod "e9e80b4e-b68a-48a2-b0fe-e5cf19e00669" (UID: "e9e80b4e-b68a-48a2-b0fe-e5cf19e00669"). InnerVolumeSpecName "kube-api-access-bwrqj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.845550 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwrqj\" (UniqueName: \"kubernetes.io/projected/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-kube-api-access-bwrqj\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.845594 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.558513 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ks68h" event={"ID":"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89","Type":"ContainerStarted","Data":"01f9646a0afc7be0a0075cd21cc75eb15ebfcd51abc1ad1bd7ae6a925a4bfdd3"}
Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.560892 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6252a079-917c-46e8-a848-10569e1e057e","Type":"ContainerStarted","Data":"e872a8250debe35b2b405169cd43bdbc962c34739bd277e35d8038f3fa166251"}
Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.561193 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.564305 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2dgrc"
Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.564684 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24","Type":"ContainerStarted","Data":"763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5"}
Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.564897 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.608148 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-ks68h" podStartSLOduration=2.439438714 podStartE2EDuration="13.608130848s" podCreationTimestamp="2026-03-01 09:27:11 +0000 UTC" firstStartedPulling="2026-03-01 09:27:12.425567358 +0000 UTC m=+1161.667446555" lastFinishedPulling="2026-03-01 09:27:23.594259492 +0000 UTC m=+1172.836138689" observedRunningTime="2026-03-01 09:27:24.589303125 +0000 UTC m=+1173.831182322" watchObservedRunningTime="2026-03-01 09:27:24.608130848 +0000 UTC m=+1173.850010045"
Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.622973 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.097113854 podStartE2EDuration="1m3.622956661s" podCreationTimestamp="2026-03-01 09:26:21 +0000 UTC" firstStartedPulling="2026-03-01 09:26:23.605844233 +0000 UTC m=+1112.847723430" lastFinishedPulling="2026-03-01 09:26:49.13168704 +0000 UTC m=+1138.373566237" observedRunningTime="2026-03-01 09:27:24.619377744 +0000 UTC m=+1173.861256941" watchObservedRunningTime="2026-03-01 09:27:24.622956661 +0000 UTC m=+1173.864835858"
Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.648491 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.442271778 podStartE2EDuration="1m3.648474688s" podCreationTimestamp="2026-03-01 09:26:21 +0000 UTC" firstStartedPulling="2026-03-01 09:26:23.864243901 +0000 UTC m=+1113.106123098" lastFinishedPulling="2026-03-01 09:26:49.070446821 +0000 UTC m=+1138.312326008" observedRunningTime="2026-03-01 09:27:24.644749836 +0000 UTC m=+1173.886629033" watchObservedRunningTime="2026-03-01 09:27:24.648474688 +0000 UTC m=+1173.890353885"
Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.618382 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mpvqc" podUID="d50ee3b1-4f97-4644-802d-04c85d9c3abc" containerName="ovn-controller" probeResult="failure" output=<
Mar 01 09:27:26 crc kubenswrapper[4792]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 01 09:27:26 crc kubenswrapper[4792]: >
Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.657584 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.659855 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.899041 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mpvqc-config-vmwpv"]
Mar 01 09:27:26 crc kubenswrapper[4792]: E0301 09:27:26.899451 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9e80b4e-b68a-48a2-b0fe-e5cf19e00669" containerName="mariadb-account-create-update"
Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.899472 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e80b4e-b68a-48a2-b0fe-e5cf19e00669" containerName="mariadb-account-create-update"
Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.899625 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9e80b4e-b68a-48a2-b0fe-e5cf19e00669" containerName="mariadb-account-create-update"
Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.900126 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.914287 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.918868 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mpvqc-config-vmwpv"]
Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.996870 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.996940 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88jl5\" (UniqueName: \"kubernetes.io/projected/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-kube-api-access-88jl5\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.996985 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run-ovn\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.997023 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-log-ovn\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.997060 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-additional-scripts\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.997087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-scripts\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.098656 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88jl5\" (UniqueName: \"kubernetes.io/projected/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-kube-api-access-88jl5\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.099129 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run-ovn\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.099480 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-log-ovn\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.099649 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-additional-scripts\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.100402 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-scripts\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.102090 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.100359 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-additional-scripts\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.099605 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-log-ovn\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.102000 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-scripts\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.099433 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run-ovn\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.102506 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.117586 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88jl5\" (UniqueName: \"kubernetes.io/projected/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-kube-api-access-88jl5\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.223738 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.509182 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mpvqc-config-vmwpv"]
Mar 01 09:27:27 crc kubenswrapper[4792]: W0301 09:27:27.535205 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2ec422a_59ee_429e_8e59_5d08e22bc9a6.slice/crio-fa73e573aee706c57ffb03f318a435a06c59a627a88dfec63bde950f977943c5 WatchSource:0}: Error finding container fa73e573aee706c57ffb03f318a435a06c59a627a88dfec63bde950f977943c5: Status 404 returned error can't find the container with id fa73e573aee706c57ffb03f318a435a06c59a627a88dfec63bde950f977943c5
Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.592752 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mpvqc-config-vmwpv" event={"ID":"e2ec422a-59ee-429e-8e59-5d08e22bc9a6","Type":"ContainerStarted","Data":"fa73e573aee706c57ffb03f318a435a06c59a627a88dfec63bde950f977943c5"}
Mar 01 09:27:28 crc kubenswrapper[4792]: I0301 09:27:28.601138 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2ec422a-59ee-429e-8e59-5d08e22bc9a6" containerID="68eabc969b4329c81ee454f5c339af1b09a491b6cf0b1ab092fc279d1ef9e440" exitCode=0
Mar 01 09:27:28 crc kubenswrapper[4792]: I0301 09:27:28.601184 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mpvqc-config-vmwpv" event={"ID":"e2ec422a-59ee-429e-8e59-5d08e22bc9a6","Type":"ContainerDied","Data":"68eabc969b4329c81ee454f5c339af1b09a491b6cf0b1ab092fc279d1ef9e440"}
Mar 01 09:27:29 crc kubenswrapper[4792]: I0301 09:27:29.932267 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.052315 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88jl5\" (UniqueName: \"kubernetes.io/projected/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-kube-api-access-88jl5\") pod \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") "
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.052671 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-log-ovn\") pod \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") "
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.052700 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run\") pod \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") "
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.052740 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-additional-scripts\") pod \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") "
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.052789 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run-ovn\") pod \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") "
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.052856 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-scripts\") pod \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") "
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.053992 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e2ec422a-59ee-429e-8e59-5d08e22bc9a6" (UID: "e2ec422a-59ee-429e-8e59-5d08e22bc9a6"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.054039 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run" (OuterVolumeSpecName: "var-run") pod "e2ec422a-59ee-429e-8e59-5d08e22bc9a6" (UID: "e2ec422a-59ee-429e-8e59-5d08e22bc9a6"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.054329 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e2ec422a-59ee-429e-8e59-5d08e22bc9a6" (UID: "e2ec422a-59ee-429e-8e59-5d08e22bc9a6"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.054359 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-scripts" (OuterVolumeSpecName: "scripts") pod "e2ec422a-59ee-429e-8e59-5d08e22bc9a6" (UID: "e2ec422a-59ee-429e-8e59-5d08e22bc9a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.054368 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e2ec422a-59ee-429e-8e59-5d08e22bc9a6" (UID: "e2ec422a-59ee-429e-8e59-5d08e22bc9a6"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.075082 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-kube-api-access-88jl5" (OuterVolumeSpecName: "kube-api-access-88jl5") pod "e2ec422a-59ee-429e-8e59-5d08e22bc9a6" (UID: "e2ec422a-59ee-429e-8e59-5d08e22bc9a6"). InnerVolumeSpecName "kube-api-access-88jl5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.154491 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.154527 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88jl5\" (UniqueName: \"kubernetes.io/projected/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-kube-api-access-88jl5\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.154539 4792 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.154559 4792 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.154568 4792 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.154576 4792 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.617208 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mpvqc-config-vmwpv" event={"ID":"e2ec422a-59ee-429e-8e59-5d08e22bc9a6","Type":"ContainerDied","Data":"fa73e573aee706c57ffb03f318a435a06c59a627a88dfec63bde950f977943c5"}
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.617254 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa73e573aee706c57ffb03f318a435a06c59a627a88dfec63bde950f977943c5"
Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.617317 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mpvqc-config-vmwpv"
Mar 01 09:27:31 crc kubenswrapper[4792]: I0301 09:27:31.081774 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mpvqc-config-vmwpv"]
Mar 01 09:27:31 crc kubenswrapper[4792]: I0301 09:27:31.091872 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mpvqc-config-vmwpv"]
Mar 01 09:27:31 crc kubenswrapper[4792]: I0301 09:27:31.416698 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2ec422a-59ee-429e-8e59-5d08e22bc9a6" path="/var/lib/kubelet/pods/e2ec422a-59ee-429e-8e59-5d08e22bc9a6/volumes"
Mar 01 09:27:31 crc kubenswrapper[4792]: I0301 09:27:31.661226 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-mpvqc"
Mar 01 09:27:33 crc kubenswrapper[4792]: I0301 09:27:33.233381 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused"
Mar 01 09:27:33 crc kubenswrapper[4792]: I0301 09:27:33.655043 4792 generic.go:334] "Generic (PLEG): container finished" podID="72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" containerID="01f9646a0afc7be0a0075cd21cc75eb15ebfcd51abc1ad1bd7ae6a925a4bfdd3" exitCode=0
Mar 01 09:27:33 crc kubenswrapper[4792]: I0301 09:27:33.655092 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ks68h" event={"ID":"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89","Type":"ContainerDied","Data":"01f9646a0afc7be0a0075cd21cc75eb15ebfcd51abc1ad1bd7ae6a925a4bfdd3"}
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.019641 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ks68h"
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.128779 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-db-sync-config-data\") pod \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") "
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.128980 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xks9p\" (UniqueName: \"kubernetes.io/projected/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-kube-api-access-xks9p\") pod \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") "
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.129058 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-config-data\") pod \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") "
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.129136 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-combined-ca-bundle\") pod \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") "
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.137613 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" (UID: "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.139185 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-kube-api-access-xks9p" (OuterVolumeSpecName: "kube-api-access-xks9p") pod "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" (UID: "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89"). InnerVolumeSpecName "kube-api-access-xks9p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.156377 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" (UID: "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.175238 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-config-data" (OuterVolumeSpecName: "config-data") pod "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" (UID: "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.231809 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.231852 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.231867 4792 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.231880 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xks9p\" (UniqueName: \"kubernetes.io/projected/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-kube-api-access-xks9p\") on node \"crc\" DevicePath \"\""
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.669161 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ks68h" event={"ID":"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89","Type":"ContainerDied","Data":"cb43fcc4ccc58a02f1becab11b8e15aea8e8f7775807952c1c371315885d3f5c"}
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.669454 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb43fcc4ccc58a02f1becab11b8e15aea8e8f7775807952c1c371315885d3f5c"
Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.669931 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ks68h"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.097649 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-p8x5s"]
Mar 01 09:27:36 crc kubenswrapper[4792]: E0301 09:27:36.097990 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" containerName="glance-db-sync"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.098004 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" containerName="glance-db-sync"
Mar 01 09:27:36 crc kubenswrapper[4792]: E0301 09:27:36.098025 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ec422a-59ee-429e-8e59-5d08e22bc9a6" containerName="ovn-config"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.098032 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ec422a-59ee-429e-8e59-5d08e22bc9a6" containerName="ovn-config"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.098191 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ec422a-59ee-429e-8e59-5d08e22bc9a6" containerName="ovn-config"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.098204 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" containerName="glance-db-sync"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.099008 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.129151 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-p8x5s"]
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.144119 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-config\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.144179 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77bc7\" (UniqueName: \"kubernetes.io/projected/667fff68-7113-4dfe-86b4-34b80b41d326-kube-api-access-77bc7\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.144207 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-dns-svc\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.144239 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-sb\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.144360 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-nb\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.245855 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-config\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.245933 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77bc7\" (UniqueName: \"kubernetes.io/projected/667fff68-7113-4dfe-86b4-34b80b41d326-kube-api-access-77bc7\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.245956 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-dns-svc\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.245985 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-sb\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.246038 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-nb\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.246978 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-config\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.247246 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-nb\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.247340 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-dns-svc\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.247426 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-sb\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.271945 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77bc7\" (UniqueName: \"kubernetes.io/projected/667fff68-7113-4dfe-86b4-34b80b41d326-kube-api-access-77bc7\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.416575 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.859532 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-p8x5s"]
Mar 01 09:27:37 crc kubenswrapper[4792]: E0301 09:27:37.231876 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod667fff68_7113_4dfe_86b4_34b80b41d326.slice/crio-e23b7bf7394271fedc5e4abedff3f86bcbcc1a5fe82a8164474dcf3fb06696d9.scope\": RecentStats: unable to find data in memory cache]"
Mar 01 09:27:37 crc kubenswrapper[4792]: I0301 09:27:37.685432 4792 generic.go:334] "Generic (PLEG): container finished" podID="667fff68-7113-4dfe-86b4-34b80b41d326" containerID="e23b7bf7394271fedc5e4abedff3f86bcbcc1a5fe82a8164474dcf3fb06696d9" exitCode=0
Mar 01 09:27:37 crc kubenswrapper[4792]: I0301 09:27:37.685491 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" event={"ID":"667fff68-7113-4dfe-86b4-34b80b41d326","Type":"ContainerDied","Data":"e23b7bf7394271fedc5e4abedff3f86bcbcc1a5fe82a8164474dcf3fb06696d9"}
Mar 01 09:27:37 crc kubenswrapper[4792]: I0301 09:27:37.685768 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" event={"ID":"667fff68-7113-4dfe-86b4-34b80b41d326","Type":"ContainerStarted","Data":"386c10f9f902b49dfdcc7e63fa9588772c851427627a00112a847f427ef0dd79"}
Mar 01 09:27:38 crc kubenswrapper[4792]: I0301 09:27:38.693854 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s"
event={"ID":"667fff68-7113-4dfe-86b4-34b80b41d326","Type":"ContainerStarted","Data":"b035d55e8df5d946580d2dfd8c4b3f7b4f07c6856584d562fa0676ed055fd3e5"} Mar 01 09:27:38 crc kubenswrapper[4792]: I0301 09:27:38.694346 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:38 crc kubenswrapper[4792]: I0301 09:27:38.719565 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" podStartSLOduration=2.719540234 podStartE2EDuration="2.719540234s" podCreationTimestamp="2026-03-01 09:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:38.709862596 +0000 UTC m=+1187.951741813" watchObservedRunningTime="2026-03-01 09:27:38.719540234 +0000 UTC m=+1187.961419441" Mar 01 09:27:42 crc kubenswrapper[4792]: I0301 09:27:42.930111 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:27:43 crc kubenswrapper[4792]: I0301 09:27:43.236215 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 01 09:27:44 crc kubenswrapper[4792]: I0301 09:27:44.838543 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-t8rmw"] Mar 01 09:27:44 crc kubenswrapper[4792]: I0301 09:27:44.839679 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:44 crc kubenswrapper[4792]: I0301 09:27:44.852018 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-t8rmw"] Mar 01 09:27:44 crc kubenswrapper[4792]: I0301 09:27:44.947800 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8714-account-create-update-wzssg"] Mar 01 09:27:44 crc kubenswrapper[4792]: I0301 09:27:44.950941 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:44 crc kubenswrapper[4792]: I0301 09:27:44.952863 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 01 09:27:44 crc kubenswrapper[4792]: I0301 09:27:44.975361 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8714-account-create-update-wzssg"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.006381 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkfk2\" (UniqueName: \"kubernetes.io/projected/f0b42afb-2954-442e-bc91-4c8275a4d2fd-kube-api-access-nkfk2\") pod \"cinder-db-create-t8rmw\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.006452 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b715bb3f-b181-4614-85c5-9155286ce80c-operator-scripts\") pod \"cinder-8714-account-create-update-wzssg\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.006488 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djxnh\" (UniqueName: 
\"kubernetes.io/projected/b715bb3f-b181-4614-85c5-9155286ce80c-kube-api-access-djxnh\") pod \"cinder-8714-account-create-update-wzssg\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.006516 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0b42afb-2954-442e-bc91-4c8275a4d2fd-operator-scripts\") pod \"cinder-db-create-t8rmw\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.017617 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-kv8gv"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.018659 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.035322 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-kv8gv"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.107365 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgch2\" (UniqueName: \"kubernetes.io/projected/bd689802-7b27-463e-a155-ed837e8594e6-kube-api-access-lgch2\") pod \"barbican-db-create-kv8gv\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.107412 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0b42afb-2954-442e-bc91-4c8275a4d2fd-operator-scripts\") pod \"cinder-db-create-t8rmw\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.107465 
4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd689802-7b27-463e-a155-ed837e8594e6-operator-scripts\") pod \"barbican-db-create-kv8gv\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.108454 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0b42afb-2954-442e-bc91-4c8275a4d2fd-operator-scripts\") pod \"cinder-db-create-t8rmw\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.108585 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkfk2\" (UniqueName: \"kubernetes.io/projected/f0b42afb-2954-442e-bc91-4c8275a4d2fd-kube-api-access-nkfk2\") pod \"cinder-db-create-t8rmw\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.108774 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b715bb3f-b181-4614-85c5-9155286ce80c-operator-scripts\") pod \"cinder-8714-account-create-update-wzssg\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.108866 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djxnh\" (UniqueName: \"kubernetes.io/projected/b715bb3f-b181-4614-85c5-9155286ce80c-kube-api-access-djxnh\") pod \"cinder-8714-account-create-update-wzssg\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 
09:27:45.109835 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b715bb3f-b181-4614-85c5-9155286ce80c-operator-scripts\") pod \"cinder-8714-account-create-update-wzssg\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.119262 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-71d5-account-create-update-mjs9k"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.120417 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.122974 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.149126 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djxnh\" (UniqueName: \"kubernetes.io/projected/b715bb3f-b181-4614-85c5-9155286ce80c-kube-api-access-djxnh\") pod \"cinder-8714-account-create-update-wzssg\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.149917 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkfk2\" (UniqueName: \"kubernetes.io/projected/f0b42afb-2954-442e-bc91-4c8275a4d2fd-kube-api-access-nkfk2\") pod \"cinder-db-create-t8rmw\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.155075 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.174393 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-71d5-account-create-update-mjs9k"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.215249 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jsld8"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.218605 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.232324 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.232510 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fr9vh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.232775 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.233100 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.235015 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scqkn\" (UniqueName: \"kubernetes.io/projected/465282ce-1312-4cb6-ae89-de6ada48a901-kube-api-access-scqkn\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.235108 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgch2\" (UniqueName: \"kubernetes.io/projected/bd689802-7b27-463e-a155-ed837e8594e6-kube-api-access-lgch2\") pod \"barbican-db-create-kv8gv\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " 
pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.235138 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-combined-ca-bundle\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.235165 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-config-data\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.235187 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-operator-scripts\") pod \"barbican-71d5-account-create-update-mjs9k\" (UID: \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.235217 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd689802-7b27-463e-a155-ed837e8594e6-operator-scripts\") pod \"barbican-db-create-kv8gv\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.235238 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-549vc\" (UniqueName: \"kubernetes.io/projected/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-kube-api-access-549vc\") pod \"barbican-71d5-account-create-update-mjs9k\" (UID: 
\"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.236173 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd689802-7b27-463e-a155-ed837e8594e6-operator-scripts\") pod \"barbican-db-create-kv8gv\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.237628 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jsld8"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.248982 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-zxh6d"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.249859 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.270173 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.275464 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgch2\" (UniqueName: \"kubernetes.io/projected/bd689802-7b27-463e-a155-ed837e8594e6-kube-api-access-lgch2\") pod \"barbican-db-create-kv8gv\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.287039 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zxh6d"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.332253 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.339534 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-combined-ca-bundle\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.339586 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-config-data\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.339611 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-operator-scripts\") pod \"barbican-71d5-account-create-update-mjs9k\" (UID: \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.339647 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-549vc\" (UniqueName: \"kubernetes.io/projected/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-kube-api-access-549vc\") pod \"barbican-71d5-account-create-update-mjs9k\" (UID: \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.339691 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scqkn\" (UniqueName: \"kubernetes.io/projected/465282ce-1312-4cb6-ae89-de6ada48a901-kube-api-access-scqkn\") pod \"keystone-db-sync-jsld8\" (UID: 
\"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.341994 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-operator-scripts\") pod \"barbican-71d5-account-create-update-mjs9k\" (UID: \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.344906 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-combined-ca-bundle\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.348098 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-config-data\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.363807 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e676-account-create-update-5ntgh"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.364745 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.365873 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-549vc\" (UniqueName: \"kubernetes.io/projected/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-kube-api-access-549vc\") pod \"barbican-71d5-account-create-update-mjs9k\" (UID: \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.366870 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.382459 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scqkn\" (UniqueName: \"kubernetes.io/projected/465282ce-1312-4cb6-ae89-de6ada48a901-kube-api-access-scqkn\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.393481 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e676-account-create-update-5ntgh"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.435135 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.441361 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5nz7\" (UniqueName: \"kubernetes.io/projected/efc2406b-db33-4a33-86f1-dd69b0f537a1-kube-api-access-m5nz7\") pod \"neutron-db-create-zxh6d\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.441502 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc2406b-db33-4a33-86f1-dd69b0f537a1-operator-scripts\") pod \"neutron-db-create-zxh6d\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.543081 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5nz7\" (UniqueName: \"kubernetes.io/projected/efc2406b-db33-4a33-86f1-dd69b0f537a1-kube-api-access-m5nz7\") pod \"neutron-db-create-zxh6d\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.543441 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46b17f7c-595d-4b78-9076-037fb2998f60-operator-scripts\") pod \"neutron-e676-account-create-update-5ntgh\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.543463 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn2p8\" (UniqueName: 
\"kubernetes.io/projected/46b17f7c-595d-4b78-9076-037fb2998f60-kube-api-access-tn2p8\") pod \"neutron-e676-account-create-update-5ntgh\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.543509 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc2406b-db33-4a33-86f1-dd69b0f537a1-operator-scripts\") pod \"neutron-db-create-zxh6d\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.544393 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc2406b-db33-4a33-86f1-dd69b0f537a1-operator-scripts\") pod \"neutron-db-create-zxh6d\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.578534 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5nz7\" (UniqueName: \"kubernetes.io/projected/efc2406b-db33-4a33-86f1-dd69b0f537a1-kube-api-access-m5nz7\") pod \"neutron-db-create-zxh6d\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.618297 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.645676 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46b17f7c-595d-4b78-9076-037fb2998f60-operator-scripts\") pod \"neutron-e676-account-create-update-5ntgh\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.645714 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn2p8\" (UniqueName: \"kubernetes.io/projected/46b17f7c-595d-4b78-9076-037fb2998f60-kube-api-access-tn2p8\") pod \"neutron-e676-account-create-update-5ntgh\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.646592 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46b17f7c-595d-4b78-9076-037fb2998f60-operator-scripts\") pod \"neutron-e676-account-create-update-5ntgh\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.658997 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.669670 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn2p8\" (UniqueName: \"kubernetes.io/projected/46b17f7c-595d-4b78-9076-037fb2998f60-kube-api-access-tn2p8\") pod \"neutron-e676-account-create-update-5ntgh\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.691243 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.728267 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-t8rmw"] Mar 01 09:27:45 crc kubenswrapper[4792]: W0301 09:27:45.849885 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0b42afb_2954_442e_bc91_4c8275a4d2fd.slice/crio-6ef24e8a781a92526aa634263b2543452521c2477d0c97fecbfe90d267730b1c WatchSource:0}: Error finding container 6ef24e8a781a92526aa634263b2543452521c2477d0c97fecbfe90d267730b1c: Status 404 returned error can't find the container with id 6ef24e8a781a92526aa634263b2543452521c2477d0c97fecbfe90d267730b1c Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.943132 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8714-account-create-update-wzssg"] Mar 01 09:27:46 crc kubenswrapper[4792]: I0301 09:27:46.151598 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-71d5-account-create-update-mjs9k"] Mar 01 09:27:46 crc kubenswrapper[4792]: I0301 09:27:46.292286 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jsld8"] Mar 01 09:27:46 crc kubenswrapper[4792]: I0301 09:27:46.333190 4792 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-db-create-kv8gv"] Mar 01 09:27:46 crc kubenswrapper[4792]: I0301 09:27:46.421493 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:46 crc kubenswrapper[4792]: I0301 09:27:46.443558 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e676-account-create-update-5ntgh"] Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.473123 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zxh6d"] Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.508600 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-mz86z"] Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.509133 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" podUID="b3682585-f554-4a65-86cb-096243ccc793" containerName="dnsmasq-dns" containerID="cri-o://6049c60340fa24dee1fdddb897ca32ccd559a6ca6bdacf1ad0f33b624dbf4865" gracePeriod=10 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.808128 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-71d5-account-create-update-mjs9k" event={"ID":"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650","Type":"ContainerStarted","Data":"1cc52ebb7e1b86f46dbab0e11949d60082faaf96962f0529b5c27c6156f59218"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.808188 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-71d5-account-create-update-mjs9k" event={"ID":"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650","Type":"ContainerStarted","Data":"212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.830034 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-71d5-account-create-update-mjs9k" podStartSLOduration=1.830018323 
podStartE2EDuration="1.830018323s" podCreationTimestamp="2026-03-01 09:27:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:46.823833891 +0000 UTC m=+1196.065713088" watchObservedRunningTime="2026-03-01 09:27:46.830018323 +0000 UTC m=+1196.071897520" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.830415 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8714-account-create-update-wzssg" event={"ID":"b715bb3f-b181-4614-85c5-9155286ce80c","Type":"ContainerStarted","Data":"d1c76ac502d7f1c626951dc28dbcf8372a61e85a2c8a22e8bee3f4ce1c1f91c2"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.830451 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8714-account-create-update-wzssg" event={"ID":"b715bb3f-b181-4614-85c5-9155286ce80c","Type":"ContainerStarted","Data":"1ea35c12547bb348b7526cdd5cb7acad399b530142f24896499aaad66db0ae8a"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.839698 4792 generic.go:334] "Generic (PLEG): container finished" podID="f0b42afb-2954-442e-bc91-4c8275a4d2fd" containerID="d49c80ba137c1dfc24f0da7a4050addac018dbbe5eed7701b9bf0c31b472eef5" exitCode=0 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.839832 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-t8rmw" event={"ID":"f0b42afb-2954-442e-bc91-4c8275a4d2fd","Type":"ContainerDied","Data":"d49c80ba137c1dfc24f0da7a4050addac018dbbe5eed7701b9bf0c31b472eef5"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.839858 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-t8rmw" event={"ID":"f0b42afb-2954-442e-bc91-4c8275a4d2fd","Type":"ContainerStarted","Data":"6ef24e8a781a92526aa634263b2543452521c2477d0c97fecbfe90d267730b1c"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.840975 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-create-zxh6d" event={"ID":"efc2406b-db33-4a33-86f1-dd69b0f537a1","Type":"ContainerStarted","Data":"0f765eadffce7430f601cba60023ade252609311b4c4752eb936b31a4dd4037c"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.841878 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jsld8" event={"ID":"465282ce-1312-4cb6-ae89-de6ada48a901","Type":"ContainerStarted","Data":"56894213f0a89a43fb441887180443bcea8f64c8866a065497a7cd889c1c397c"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.842995 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kv8gv" event={"ID":"bd689802-7b27-463e-a155-ed837e8594e6","Type":"ContainerStarted","Data":"eed00bc467207cc8dc0e2dcbaae3a8c2c1b42a1295709035f04dc74d0943f1f0"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.844989 4792 generic.go:334] "Generic (PLEG): container finished" podID="b3682585-f554-4a65-86cb-096243ccc793" containerID="6049c60340fa24dee1fdddb897ca32ccd559a6ca6bdacf1ad0f33b624dbf4865" exitCode=0 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.845087 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" event={"ID":"b3682585-f554-4a65-86cb-096243ccc793","Type":"ContainerDied","Data":"6049c60340fa24dee1fdddb897ca32ccd559a6ca6bdacf1ad0f33b624dbf4865"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.846486 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e676-account-create-update-5ntgh" event={"ID":"46b17f7c-595d-4b78-9076-037fb2998f60","Type":"ContainerStarted","Data":"babfcbb82a67d5d4aee470254138e36f3a5f2fe5e63c17001d84f8159a1935e7"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.859364 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-8714-account-create-update-wzssg" podStartSLOduration=2.859347773 podStartE2EDuration="2.859347773s" podCreationTimestamp="2026-03-01 
09:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:46.854242378 +0000 UTC m=+1196.096121585" watchObservedRunningTime="2026-03-01 09:27:46.859347773 +0000 UTC m=+1196.101226970" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.126636 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.298493 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m768\" (UniqueName: \"kubernetes.io/projected/b3682585-f554-4a65-86cb-096243ccc793-kube-api-access-8m768\") pod \"b3682585-f554-4a65-86cb-096243ccc793\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.298571 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-nb\") pod \"b3682585-f554-4a65-86cb-096243ccc793\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.298645 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-dns-svc\") pod \"b3682585-f554-4a65-86cb-096243ccc793\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.298868 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-config\") pod \"b3682585-f554-4a65-86cb-096243ccc793\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.298893 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-sb\") pod \"b3682585-f554-4a65-86cb-096243ccc793\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.320994 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3682585-f554-4a65-86cb-096243ccc793-kube-api-access-8m768" (OuterVolumeSpecName: "kube-api-access-8m768") pod "b3682585-f554-4a65-86cb-096243ccc793" (UID: "b3682585-f554-4a65-86cb-096243ccc793"). InnerVolumeSpecName "kube-api-access-8m768". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.363264 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b3682585-f554-4a65-86cb-096243ccc793" (UID: "b3682585-f554-4a65-86cb-096243ccc793"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.377191 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3682585-f554-4a65-86cb-096243ccc793" (UID: "b3682585-f554-4a65-86cb-096243ccc793"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.377264 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-config" (OuterVolumeSpecName: "config") pod "b3682585-f554-4a65-86cb-096243ccc793" (UID: "b3682585-f554-4a65-86cb-096243ccc793"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.400991 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.401015 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.401026 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m768\" (UniqueName: \"kubernetes.io/projected/b3682585-f554-4a65-86cb-096243ccc793-kube-api-access-8m768\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.401034 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.409688 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b3682585-f554-4a65-86cb-096243ccc793" (UID: "b3682585-f554-4a65-86cb-096243ccc793"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:47 crc kubenswrapper[4792]: E0301 09:27:47.477305 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefc2406b_db33_4a33_86f1_dd69b0f537a1.slice/crio-3d36f5fe200b4f79f67807598d19a358fe63f35f70500bebea7ecf29d4c8c11d.scope\": RecentStats: unable to find data in memory cache]" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.503086 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.862460 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd689802-7b27-463e-a155-ed837e8594e6" containerID="a5333afd5d7c2f19e4d0551bd45c113ef37b9f8fcc1a7b85eb962769ca9d63e5" exitCode=0 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.862529 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kv8gv" event={"ID":"bd689802-7b27-463e-a155-ed837e8594e6","Type":"ContainerDied","Data":"a5333afd5d7c2f19e4d0551bd45c113ef37b9f8fcc1a7b85eb962769ca9d63e5"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.865152 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" event={"ID":"b3682585-f554-4a65-86cb-096243ccc793","Type":"ContainerDied","Data":"b2e8c06688804bde6de3674de22e963e00a7db65a6fb4924d8a98f95171a76cc"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.865213 4792 scope.go:117] "RemoveContainer" containerID="6049c60340fa24dee1fdddb897ca32ccd559a6ca6bdacf1ad0f33b624dbf4865" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.865323 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.867864 4792 generic.go:334] "Generic (PLEG): container finished" podID="46b17f7c-595d-4b78-9076-037fb2998f60" containerID="2a9eb88c21c0505fd080c3b8fba46cc255546b5fb4c130561920988c70383a89" exitCode=0 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.867922 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e676-account-create-update-5ntgh" event={"ID":"46b17f7c-595d-4b78-9076-037fb2998f60","Type":"ContainerDied","Data":"2a9eb88c21c0505fd080c3b8fba46cc255546b5fb4c130561920988c70383a89"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.871176 4792 generic.go:334] "Generic (PLEG): container finished" podID="dabb3d2e-57fa-4ad3-9f3b-b85e0b670650" containerID="1cc52ebb7e1b86f46dbab0e11949d60082faaf96962f0529b5c27c6156f59218" exitCode=0 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.871236 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-71d5-account-create-update-mjs9k" event={"ID":"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650","Type":"ContainerDied","Data":"1cc52ebb7e1b86f46dbab0e11949d60082faaf96962f0529b5c27c6156f59218"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.872988 4792 generic.go:334] "Generic (PLEG): container finished" podID="b715bb3f-b181-4614-85c5-9155286ce80c" containerID="d1c76ac502d7f1c626951dc28dbcf8372a61e85a2c8a22e8bee3f4ce1c1f91c2" exitCode=0 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.873035 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8714-account-create-update-wzssg" event={"ID":"b715bb3f-b181-4614-85c5-9155286ce80c","Type":"ContainerDied","Data":"d1c76ac502d7f1c626951dc28dbcf8372a61e85a2c8a22e8bee3f4ce1c1f91c2"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.878414 4792 generic.go:334] "Generic (PLEG): container finished" podID="efc2406b-db33-4a33-86f1-dd69b0f537a1" 
containerID="3d36f5fe200b4f79f67807598d19a358fe63f35f70500bebea7ecf29d4c8c11d" exitCode=0 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.878599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zxh6d" event={"ID":"efc2406b-db33-4a33-86f1-dd69b0f537a1","Type":"ContainerDied","Data":"3d36f5fe200b4f79f67807598d19a358fe63f35f70500bebea7ecf29d4c8c11d"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.921443 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-mz86z"] Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.928127 4792 scope.go:117] "RemoveContainer" containerID="0ca19db3fcc227c24e95850e002b21aaf1788482cc19ab24284a2d399a8eb0fd" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.931765 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-mz86z"] Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.148809 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.326236 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0b42afb-2954-442e-bc91-4c8275a4d2fd-operator-scripts\") pod \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.326338 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkfk2\" (UniqueName: \"kubernetes.io/projected/f0b42afb-2954-442e-bc91-4c8275a4d2fd-kube-api-access-nkfk2\") pod \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.326997 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b42afb-2954-442e-bc91-4c8275a4d2fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f0b42afb-2954-442e-bc91-4c8275a4d2fd" (UID: "f0b42afb-2954-442e-bc91-4c8275a4d2fd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.329582 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0b42afb-2954-442e-bc91-4c8275a4d2fd-kube-api-access-nkfk2" (OuterVolumeSpecName: "kube-api-access-nkfk2") pod "f0b42afb-2954-442e-bc91-4c8275a4d2fd" (UID: "f0b42afb-2954-442e-bc91-4c8275a4d2fd"). InnerVolumeSpecName "kube-api-access-nkfk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.427998 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0b42afb-2954-442e-bc91-4c8275a4d2fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.428034 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkfk2\" (UniqueName: \"kubernetes.io/projected/f0b42afb-2954-442e-bc91-4c8275a4d2fd-kube-api-access-nkfk2\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.887297 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-t8rmw" event={"ID":"f0b42afb-2954-442e-bc91-4c8275a4d2fd","Type":"ContainerDied","Data":"6ef24e8a781a92526aa634263b2543452521c2477d0c97fecbfe90d267730b1c"} Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.887350 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ef24e8a781a92526aa634263b2543452521c2477d0c97fecbfe90d267730b1c" Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.887316 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:49 crc kubenswrapper[4792]: I0301 09:27:49.418441 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3682585-f554-4a65-86cb-096243ccc793" path="/var/lib/kubelet/pods/b3682585-f554-4a65-86cb-096243ccc793/volumes" Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.909787 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e676-account-create-update-5ntgh" event={"ID":"46b17f7c-595d-4b78-9076-037fb2998f60","Type":"ContainerDied","Data":"babfcbb82a67d5d4aee470254138e36f3a5f2fe5e63c17001d84f8159a1935e7"} Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.910142 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="babfcbb82a67d5d4aee470254138e36f3a5f2fe5e63c17001d84f8159a1935e7" Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.912059 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-71d5-account-create-update-mjs9k" event={"ID":"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650","Type":"ContainerDied","Data":"212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785"} Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.912087 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785" Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.913767 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8714-account-create-update-wzssg" event={"ID":"b715bb3f-b181-4614-85c5-9155286ce80c","Type":"ContainerDied","Data":"1ea35c12547bb348b7526cdd5cb7acad399b530142f24896499aaad66db0ae8a"} Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.913795 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ea35c12547bb348b7526cdd5cb7acad399b530142f24896499aaad66db0ae8a" Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 
09:27:50.915982 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zxh6d" event={"ID":"efc2406b-db33-4a33-86f1-dd69b0f537a1","Type":"ContainerDied","Data":"0f765eadffce7430f601cba60023ade252609311b4c4752eb936b31a4dd4037c"} Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.916038 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f765eadffce7430f601cba60023ade252609311b4c4752eb936b31a4dd4037c" Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.918625 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kv8gv" event={"ID":"bd689802-7b27-463e-a155-ed837e8594e6","Type":"ContainerDied","Data":"eed00bc467207cc8dc0e2dcbaae3a8c2c1b42a1295709035f04dc74d0943f1f0"} Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.918652 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eed00bc467207cc8dc0e2dcbaae3a8c2c1b42a1295709035f04dc74d0943f1f0" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.029417 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.034270 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.053598 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.074423 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-operator-scripts\") pod \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\" (UID: \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.074489 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc2406b-db33-4a33-86f1-dd69b0f537a1-operator-scripts\") pod \"efc2406b-db33-4a33-86f1-dd69b0f537a1\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.074539 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-549vc\" (UniqueName: \"kubernetes.io/projected/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-kube-api-access-549vc\") pod \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\" (UID: \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.074561 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5nz7\" (UniqueName: \"kubernetes.io/projected/efc2406b-db33-4a33-86f1-dd69b0f537a1-kube-api-access-m5nz7\") pod \"efc2406b-db33-4a33-86f1-dd69b0f537a1\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.074579 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn2p8\" (UniqueName: \"kubernetes.io/projected/46b17f7c-595d-4b78-9076-037fb2998f60-kube-api-access-tn2p8\") pod \"46b17f7c-595d-4b78-9076-037fb2998f60\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.074623 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46b17f7c-595d-4b78-9076-037fb2998f60-operator-scripts\") pod \"46b17f7c-595d-4b78-9076-037fb2998f60\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.076296 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46b17f7c-595d-4b78-9076-037fb2998f60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46b17f7c-595d-4b78-9076-037fb2998f60" (UID: "46b17f7c-595d-4b78-9076-037fb2998f60"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.076963 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc2406b-db33-4a33-86f1-dd69b0f537a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efc2406b-db33-4a33-86f1-dd69b0f537a1" (UID: "efc2406b-db33-4a33-86f1-dd69b0f537a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.077134 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dabb3d2e-57fa-4ad3-9f3b-b85e0b670650" (UID: "dabb3d2e-57fa-4ad3-9f3b-b85e0b670650"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.090445 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-kube-api-access-549vc" (OuterVolumeSpecName: "kube-api-access-549vc") pod "dabb3d2e-57fa-4ad3-9f3b-b85e0b670650" (UID: "dabb3d2e-57fa-4ad3-9f3b-b85e0b670650"). 
InnerVolumeSpecName "kube-api-access-549vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.092167 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46b17f7c-595d-4b78-9076-037fb2998f60-kube-api-access-tn2p8" (OuterVolumeSpecName: "kube-api-access-tn2p8") pod "46b17f7c-595d-4b78-9076-037fb2998f60" (UID: "46b17f7c-595d-4b78-9076-037fb2998f60"). InnerVolumeSpecName "kube-api-access-tn2p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.094689 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc2406b-db33-4a33-86f1-dd69b0f537a1-kube-api-access-m5nz7" (OuterVolumeSpecName: "kube-api-access-m5nz7") pod "efc2406b-db33-4a33-86f1-dd69b0f537a1" (UID: "efc2406b-db33-4a33-86f1-dd69b0f537a1"). InnerVolumeSpecName "kube-api-access-m5nz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.136235 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.139900 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.181265 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.181339 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc2406b-db33-4a33-86f1-dd69b0f537a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.181378 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-549vc\" (UniqueName: \"kubernetes.io/projected/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-kube-api-access-549vc\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.181398 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5nz7\" (UniqueName: \"kubernetes.io/projected/efc2406b-db33-4a33-86f1-dd69b0f537a1-kube-api-access-m5nz7\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.181424 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn2p8\" (UniqueName: \"kubernetes.io/projected/46b17f7c-595d-4b78-9076-037fb2998f60-kube-api-access-tn2p8\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.181433 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46b17f7c-595d-4b78-9076-037fb2998f60-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.282216 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd689802-7b27-463e-a155-ed837e8594e6-operator-scripts\") pod 
\"bd689802-7b27-463e-a155-ed837e8594e6\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.282384 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b715bb3f-b181-4614-85c5-9155286ce80c-operator-scripts\") pod \"b715bb3f-b181-4614-85c5-9155286ce80c\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.282473 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgch2\" (UniqueName: \"kubernetes.io/projected/bd689802-7b27-463e-a155-ed837e8594e6-kube-api-access-lgch2\") pod \"bd689802-7b27-463e-a155-ed837e8594e6\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.282495 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djxnh\" (UniqueName: \"kubernetes.io/projected/b715bb3f-b181-4614-85c5-9155286ce80c-kube-api-access-djxnh\") pod \"b715bb3f-b181-4614-85c5-9155286ce80c\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.282756 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd689802-7b27-463e-a155-ed837e8594e6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd689802-7b27-463e-a155-ed837e8594e6" (UID: "bd689802-7b27-463e-a155-ed837e8594e6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.283243 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd689802-7b27-463e-a155-ed837e8594e6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.283372 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b715bb3f-b181-4614-85c5-9155286ce80c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b715bb3f-b181-4614-85c5-9155286ce80c" (UID: "b715bb3f-b181-4614-85c5-9155286ce80c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.285698 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b715bb3f-b181-4614-85c5-9155286ce80c-kube-api-access-djxnh" (OuterVolumeSpecName: "kube-api-access-djxnh") pod "b715bb3f-b181-4614-85c5-9155286ce80c" (UID: "b715bb3f-b181-4614-85c5-9155286ce80c"). InnerVolumeSpecName "kube-api-access-djxnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.287883 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd689802-7b27-463e-a155-ed837e8594e6-kube-api-access-lgch2" (OuterVolumeSpecName: "kube-api-access-lgch2") pod "bd689802-7b27-463e-a155-ed837e8594e6" (UID: "bd689802-7b27-463e-a155-ed837e8594e6"). InnerVolumeSpecName "kube-api-access-lgch2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.384849 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b715bb3f-b181-4614-85c5-9155286ce80c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.384896 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgch2\" (UniqueName: \"kubernetes.io/projected/bd689802-7b27-463e-a155-ed837e8594e6-kube-api-access-lgch2\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.384931 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djxnh\" (UniqueName: \"kubernetes.io/projected/b715bb3f-b181-4614-85c5-9155286ce80c-kube-api-access-djxnh\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.927098 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jsld8" event={"ID":"465282ce-1312-4cb6-ae89-de6ada48a901","Type":"ContainerStarted","Data":"ca00fe55b9ec531c46f7a5dbc120ce6185518403d6904d883bc1ec756288e2a0"} Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.927223 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.927266 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.927214 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.927269 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.927372 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.946400 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jsld8" podStartSLOduration=2.37540398 podStartE2EDuration="6.946382109s" podCreationTimestamp="2026-03-01 09:27:45 +0000 UTC" firstStartedPulling="2026-03-01 09:27:46.331957297 +0000 UTC m=+1195.573836484" lastFinishedPulling="2026-03-01 09:27:50.902935416 +0000 UTC m=+1200.144814613" observedRunningTime="2026-03-01 09:27:51.940860243 +0000 UTC m=+1201.182739440" watchObservedRunningTime="2026-03-01 09:27:51.946382109 +0000 UTC m=+1201.188261316" Mar 01 09:27:54 crc kubenswrapper[4792]: I0301 09:27:54.949754 4792 generic.go:334] "Generic (PLEG): container finished" podID="465282ce-1312-4cb6-ae89-de6ada48a901" containerID="ca00fe55b9ec531c46f7a5dbc120ce6185518403d6904d883bc1ec756288e2a0" exitCode=0 Mar 01 09:27:54 crc kubenswrapper[4792]: I0301 09:27:54.949994 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jsld8" event={"ID":"465282ce-1312-4cb6-ae89-de6ada48a901","Type":"ContainerDied","Data":"ca00fe55b9ec531c46f7a5dbc120ce6185518403d6904d883bc1ec756288e2a0"} Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.243809 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.276644 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scqkn\" (UniqueName: \"kubernetes.io/projected/465282ce-1312-4cb6-ae89-de6ada48a901-kube-api-access-scqkn\") pod \"465282ce-1312-4cb6-ae89-de6ada48a901\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.276795 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-combined-ca-bundle\") pod \"465282ce-1312-4cb6-ae89-de6ada48a901\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.276883 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-config-data\") pod \"465282ce-1312-4cb6-ae89-de6ada48a901\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.298654 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/465282ce-1312-4cb6-ae89-de6ada48a901-kube-api-access-scqkn" (OuterVolumeSpecName: "kube-api-access-scqkn") pod "465282ce-1312-4cb6-ae89-de6ada48a901" (UID: "465282ce-1312-4cb6-ae89-de6ada48a901"). InnerVolumeSpecName "kube-api-access-scqkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.305167 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "465282ce-1312-4cb6-ae89-de6ada48a901" (UID: "465282ce-1312-4cb6-ae89-de6ada48a901"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.334425 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-config-data" (OuterVolumeSpecName: "config-data") pod "465282ce-1312-4cb6-ae89-de6ada48a901" (UID: "465282ce-1312-4cb6-ae89-de6ada48a901"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.378716 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.378837 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.378892 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scqkn\" (UniqueName: \"kubernetes.io/projected/465282ce-1312-4cb6-ae89-de6ada48a901-kube-api-access-scqkn\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.968302 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jsld8" event={"ID":"465282ce-1312-4cb6-ae89-de6ada48a901","Type":"ContainerDied","Data":"56894213f0a89a43fb441887180443bcea8f64c8866a065497a7cd889c1c397c"} Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.968353 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56894213f0a89a43fb441887180443bcea8f64c8866a065497a7cd889c1c397c" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.968400 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.215738 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7796644bc-dm4ww"] Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216407 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3682585-f554-4a65-86cb-096243ccc793" containerName="dnsmasq-dns" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216427 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3682585-f554-4a65-86cb-096243ccc793" containerName="dnsmasq-dns" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216436 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b17f7c-595d-4b78-9076-037fb2998f60" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216443 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b17f7c-595d-4b78-9076-037fb2998f60" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216456 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd689802-7b27-463e-a155-ed837e8594e6" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216463 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd689802-7b27-463e-a155-ed837e8594e6" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216474 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465282ce-1312-4cb6-ae89-de6ada48a901" containerName="keystone-db-sync" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216479 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="465282ce-1312-4cb6-ae89-de6ada48a901" containerName="keystone-db-sync" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216488 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b3682585-f554-4a65-86cb-096243ccc793" containerName="init" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216494 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3682585-f554-4a65-86cb-096243ccc793" containerName="init" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216504 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b715bb3f-b181-4614-85c5-9155286ce80c" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216512 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b715bb3f-b181-4614-85c5-9155286ce80c" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216521 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc2406b-db33-4a33-86f1-dd69b0f537a1" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216527 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc2406b-db33-4a33-86f1-dd69b0f537a1" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216539 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b42afb-2954-442e-bc91-4c8275a4d2fd" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216545 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b42afb-2954-442e-bc91-4c8275a4d2fd" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216551 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabb3d2e-57fa-4ad3-9f3b-b85e0b670650" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216556 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabb3d2e-57fa-4ad3-9f3b-b85e0b670650" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216692 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b3682585-f554-4a65-86cb-096243ccc793" containerName="dnsmasq-dns" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216709 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabb3d2e-57fa-4ad3-9f3b-b85e0b670650" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216720 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b715bb3f-b181-4614-85c5-9155286ce80c" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216729 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b42afb-2954-442e-bc91-4c8275a4d2fd" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216738 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd689802-7b27-463e-a155-ed837e8594e6" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216748 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="46b17f7c-595d-4b78-9076-037fb2998f60" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216762 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc2406b-db33-4a33-86f1-dd69b0f537a1" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216771 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="465282ce-1312-4cb6-ae89-de6ada48a901" containerName="keystone-db-sync" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.217534 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.237195 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2zjg5"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.238249 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.243979 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.243982 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.244039 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fr9vh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.244108 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.244793 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.255802 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7796644bc-dm4ww"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.265585 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2zjg5"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295002 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-config-data\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295034 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-scripts\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295076 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-credential-keys\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295092 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-dns-svc\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295115 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-sb\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295145 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-combined-ca-bundle\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295170 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxwh8\" (UniqueName: \"kubernetes.io/projected/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-kube-api-access-jxwh8\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295202 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mwlc\" (UniqueName: \"kubernetes.io/projected/30b802af-3af5-430f-b06f-709fd4606fd0-kube-api-access-4mwlc\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295219 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-config\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295258 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-nb\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295295 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-fernet-keys\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 
09:27:57.396156 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxwh8\" (UniqueName: \"kubernetes.io/projected/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-kube-api-access-jxwh8\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396198 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mwlc\" (UniqueName: \"kubernetes.io/projected/30b802af-3af5-430f-b06f-709fd4606fd0-kube-api-access-4mwlc\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396216 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-config\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396266 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-nb\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396304 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-fernet-keys\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396343 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-config-data\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396363 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-scripts\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396383 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-credential-keys\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396398 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-dns-svc\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396417 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-sb\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396441 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-combined-ca-bundle\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.397967 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-dns-svc\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.398002 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-sb\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.398505 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-nb\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.398644 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-config\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.403805 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-credential-keys\") pod \"keystone-bootstrap-2zjg5\" (UID: 
\"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.403887 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-scripts\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.404031 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-combined-ca-bundle\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.404464 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-fernet-keys\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.409569 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-config-data\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.450503 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mwlc\" (UniqueName: \"kubernetes.io/projected/30b802af-3af5-430f-b06f-709fd4606fd0-kube-api-access-4mwlc\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 
09:27:57.465986 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxwh8\" (UniqueName: \"kubernetes.io/projected/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-kube-api-access-jxwh8\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.534293 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.551311 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.560841 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.567013 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.567502 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.590124 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.601843 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.601891 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-config-data\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.601938 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.601959 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-scripts\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.602008 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-run-httpd\") pod 
\"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.602042 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-log-httpd\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.602072 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgczn\" (UniqueName: \"kubernetes.io/projected/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-kube-api-access-zgczn\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.604545 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-gsxqb"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.605857 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.621666 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.621843 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7rpd7" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.621960 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.649623 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.701952 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-gbmwh"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.702960 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704546 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704593 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-scripts\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704619 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-config-data\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704640 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704658 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-scripts\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704698 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/737aa0a0-6e53-451e-9d5f-2deada87b5b4-etc-machine-id\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704718 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-config-data\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704732 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-combined-ca-bundle\") pod \"cinder-db-sync-gsxqb\" (UID: 
\"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704757 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-run-httpd\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84xmq\" (UniqueName: \"kubernetes.io/projected/737aa0a0-6e53-451e-9d5f-2deada87b5b4-kube-api-access-84xmq\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704810 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-log-httpd\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704839 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-db-sync-config-data\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704855 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgczn\" (UniqueName: \"kubernetes.io/projected/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-kube-api-access-zgczn\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 
09:27:57.712053 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.712237 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k82sx" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.712335 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.712704 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-run-httpd\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.715152 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-log-httpd\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.717638 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-config-data\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.726406 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.729151 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.737768 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-scripts\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.740659 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgczn\" (UniqueName: \"kubernetes.io/projected/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-kube-api-access-zgczn\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.743246 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gsxqb"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.787979 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-gbmwh"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806452 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-config\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806580 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/737aa0a0-6e53-451e-9d5f-2deada87b5b4-etc-machine-id\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 
09:27:57.806606 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-combined-ca-bundle\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806624 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-config-data\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806641 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-combined-ca-bundle\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806683 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84xmq\" (UniqueName: \"kubernetes.io/projected/737aa0a0-6e53-451e-9d5f-2deada87b5b4-kube-api-access-84xmq\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806731 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-db-sync-config-data\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806770 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95d98\" (UniqueName: \"kubernetes.io/projected/66aba873-81b0-452a-81f9-73cc18445180-kube-api-access-95d98\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806814 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-scripts\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.807217 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/737aa0a0-6e53-451e-9d5f-2deada87b5b4-etc-machine-id\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.812657 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-combined-ca-bundle\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.814147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-db-sync-config-data\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.818484 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-config-data\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.818840 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-scripts\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.845405 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84xmq\" (UniqueName: \"kubernetes.io/projected/737aa0a0-6e53-451e-9d5f-2deada87b5b4-kube-api-access-84xmq\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.854128 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7796644bc-dm4ww"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.862872 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-bxx5d"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.864093 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.867342 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-wjs57" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.868285 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.879208 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bxx5d"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.891149 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-f89zl"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.892355 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.898097 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-z8zjf" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.900608 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.900801 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.906482 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589bbb667-6gxlc"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.908505 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzxv7\" (UniqueName: \"kubernetes.io/projected/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-kube-api-access-pzxv7\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:57 crc 
kubenswrapper[4792]: I0301 09:27:57.908597 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95d98\" (UniqueName: \"kubernetes.io/projected/66aba873-81b0-452a-81f9-73cc18445180-kube-api-access-95d98\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.908666 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-db-sync-config-data\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.908715 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-config\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.908753 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-combined-ca-bundle\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.908779 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-combined-ca-bundle\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.913693 4792 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.946638 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-f89zl"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.955209 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589bbb667-6gxlc"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.960837 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-config\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.963068 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-combined-ca-bundle\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.963517 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95d98\" (UniqueName: \"kubernetes.io/projected/66aba873-81b0-452a-81f9-73cc18445180-kube-api-access-95d98\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.000179 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013064 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-combined-ca-bundle\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013119 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-scripts\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013143 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-config-data\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013167 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e623b24a-64a5-4209-86bb-1814ae9c400b-logs\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013190 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv7bl\" (UniqueName: \"kubernetes.io/projected/4e1a508b-9db4-414a-b06d-2f01a2c132a1-kube-api-access-cv7bl\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " 
pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013228 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-nb\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013260 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-db-sync-config-data\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013300 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f79b\" (UniqueName: \"kubernetes.io/projected/e623b24a-64a5-4209-86bb-1814ae9c400b-kube-api-access-4f79b\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013349 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-config\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013380 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-dns-svc\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " 
pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013407 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-combined-ca-bundle\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013460 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzxv7\" (UniqueName: \"kubernetes.io/projected/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-kube-api-access-pzxv7\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013504 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-sb\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.017354 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-db-sync-config-data\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.039528 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.039986 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-combined-ca-bundle\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.040207 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzxv7\" (UniqueName: \"kubernetes.io/projected/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-kube-api-access-pzxv7\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.051390 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.114960 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-combined-ca-bundle\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115009 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-scripts\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115032 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-config-data\") pod 
\"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115054 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e623b24a-64a5-4209-86bb-1814ae9c400b-logs\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115080 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv7bl\" (UniqueName: \"kubernetes.io/projected/4e1a508b-9db4-414a-b06d-2f01a2c132a1-kube-api-access-cv7bl\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115110 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-nb\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115151 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f79b\" (UniqueName: \"kubernetes.io/projected/e623b24a-64a5-4209-86bb-1814ae9c400b-kube-api-access-4f79b\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115199 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-config\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " 
pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115229 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-dns-svc\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115293 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-sb\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.116316 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-sb\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.116841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-nb\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.117730 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-config\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.119113 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e623b24a-64a5-4209-86bb-1814ae9c400b-logs\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.119796 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-dns-svc\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.121087 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-config-data\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.121542 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-combined-ca-bundle\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.121926 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-scripts\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: E0301 09:27:58.129405 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice/crio-212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice\": RecentStats: unable to find data in memory cache]" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.135596 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv7bl\" (UniqueName: \"kubernetes.io/projected/4e1a508b-9db4-414a-b06d-2f01a2c132a1-kube-api-access-cv7bl\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.146507 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f79b\" (UniqueName: \"kubernetes.io/projected/e623b24a-64a5-4209-86bb-1814ae9c400b-kube-api-access-4f79b\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.293266 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.330548 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.342225 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.388437 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2zjg5"] Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.557423 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7796644bc-dm4ww"] Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.720202 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gsxqb"] Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.731670 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.874868 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-gbmwh"] Mar 01 09:27:58 crc kubenswrapper[4792]: W0301 09:27:58.898564 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66aba873_81b0_452a_81f9_73cc18445180.slice/crio-0d7cc39250ba374cd76a9cab23087e1f60274ea1219843b9f8303a09038d9fc3 WatchSource:0}: Error finding container 0d7cc39250ba374cd76a9cab23087e1f60274ea1219843b9f8303a09038d9fc3: Status 404 returned error can't find the container with id 0d7cc39250ba374cd76a9cab23087e1f60274ea1219843b9f8303a09038d9fc3 Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.997717 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2zjg5" event={"ID":"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d","Type":"ContainerStarted","Data":"670f35fecbdbd6b75d71f5beac8b8d230ac85378c92ee4b0813d0d49f8a4dde5"} Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.997758 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2zjg5" 
event={"ID":"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d","Type":"ContainerStarted","Data":"1f893a9e0566eb476f19571ee15d1e6b6197f05ea1b028ec4b64824306365a2d"} Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.004535 4792 generic.go:334] "Generic (PLEG): container finished" podID="30b802af-3af5-430f-b06f-709fd4606fd0" containerID="da4e59955cfda8f625c142c442b34e7bee84a99ca2a745a710222ff222c9839d" exitCode=0 Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.004620 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7796644bc-dm4ww" event={"ID":"30b802af-3af5-430f-b06f-709fd4606fd0","Type":"ContainerDied","Data":"da4e59955cfda8f625c142c442b34e7bee84a99ca2a745a710222ff222c9839d"} Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.004643 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7796644bc-dm4ww" event={"ID":"30b802af-3af5-430f-b06f-709fd4606fd0","Type":"ContainerStarted","Data":"16b663874cdd99123d93e52d5d4fca0608f91f60cad52b8d32e1ae059e5512ec"} Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.008489 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gbmwh" event={"ID":"66aba873-81b0-452a-81f9-73cc18445180","Type":"ContainerStarted","Data":"0d7cc39250ba374cd76a9cab23087e1f60274ea1219843b9f8303a09038d9fc3"} Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.010010 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gsxqb" event={"ID":"737aa0a0-6e53-451e-9d5f-2deada87b5b4","Type":"ContainerStarted","Data":"dc7bf25ff6493b89f8d3d42eee96feaadc16025ec1b5d1ef3c591647a4fb7abf"} Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.017804 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerStarted","Data":"4a875a1ec016948e8ea916192a157e8f35d24195db5c29c64d33740934a209c2"} Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 
09:27:59.063193 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2zjg5" podStartSLOduration=2.063170056 podStartE2EDuration="2.063170056s" podCreationTimestamp="2026-03-01 09:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:59.015147197 +0000 UTC m=+1208.257026394" watchObservedRunningTime="2026-03-01 09:27:59.063170056 +0000 UTC m=+1208.305049253" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.103949 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bxx5d"] Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.176101 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-f89zl"] Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.218662 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589bbb667-6gxlc"] Mar 01 09:27:59 crc kubenswrapper[4792]: W0301 09:27:59.225844 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e1a508b_9db4_414a_b06d_2f01a2c132a1.slice/crio-0fc29a339e8c09faaadb240388644d964ff0e254b0fbfad00516d578fb8bff1b WatchSource:0}: Error finding container 0fc29a339e8c09faaadb240388644d964ff0e254b0fbfad00516d578fb8bff1b: Status 404 returned error can't find the container with id 0fc29a339e8c09faaadb240388644d964ff0e254b0fbfad00516d578fb8bff1b Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.387839 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.459421 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-config\") pod \"30b802af-3af5-430f-b06f-709fd4606fd0\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.459751 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-dns-svc\") pod \"30b802af-3af5-430f-b06f-709fd4606fd0\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.459796 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-sb\") pod \"30b802af-3af5-430f-b06f-709fd4606fd0\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.459820 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mwlc\" (UniqueName: \"kubernetes.io/projected/30b802af-3af5-430f-b06f-709fd4606fd0-kube-api-access-4mwlc\") pod \"30b802af-3af5-430f-b06f-709fd4606fd0\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.459864 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-nb\") pod \"30b802af-3af5-430f-b06f-709fd4606fd0\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.504121 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/30b802af-3af5-430f-b06f-709fd4606fd0-kube-api-access-4mwlc" (OuterVolumeSpecName: "kube-api-access-4mwlc") pod "30b802af-3af5-430f-b06f-709fd4606fd0" (UID: "30b802af-3af5-430f-b06f-709fd4606fd0"). InnerVolumeSpecName "kube-api-access-4mwlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.514748 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "30b802af-3af5-430f-b06f-709fd4606fd0" (UID: "30b802af-3af5-430f-b06f-709fd4606fd0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.518553 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-config" (OuterVolumeSpecName: "config") pod "30b802af-3af5-430f-b06f-709fd4606fd0" (UID: "30b802af-3af5-430f-b06f-709fd4606fd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.524480 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "30b802af-3af5-430f-b06f-709fd4606fd0" (UID: "30b802af-3af5-430f-b06f-709fd4606fd0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.527312 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "30b802af-3af5-430f-b06f-709fd4606fd0" (UID: "30b802af-3af5-430f-b06f-709fd4606fd0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.561480 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.561502 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.561513 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mwlc\" (UniqueName: \"kubernetes.io/projected/30b802af-3af5-430f-b06f-709fd4606fd0-kube-api-access-4mwlc\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.561521 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.561531 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.777211 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.056054 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gbmwh" event={"ID":"66aba873-81b0-452a-81f9-73cc18445180","Type":"ContainerStarted","Data":"acaeed4a0d4cc4c819f11994601b2946e5e093014d4fc45dbb6ce057d16aef6a"} Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.058715 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerID="70a638170c1ac600ae163392b053537afb8b6aa9687b87f677426f4f023db168" exitCode=0 Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.058758 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" event={"ID":"4e1a508b-9db4-414a-b06d-2f01a2c132a1","Type":"ContainerDied","Data":"70a638170c1ac600ae163392b053537afb8b6aa9687b87f677426f4f023db168"} Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.058776 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" event={"ID":"4e1a508b-9db4-414a-b06d-2f01a2c132a1","Type":"ContainerStarted","Data":"0fc29a339e8c09faaadb240388644d964ff0e254b0fbfad00516d578fb8bff1b"} Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.111219 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7796644bc-dm4ww" event={"ID":"30b802af-3af5-430f-b06f-709fd4606fd0","Type":"ContainerDied","Data":"16b663874cdd99123d93e52d5d4fca0608f91f60cad52b8d32e1ae059e5512ec"} Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.111267 4792 scope.go:117] "RemoveContainer" containerID="da4e59955cfda8f625c142c442b34e7bee84a99ca2a745a710222ff222c9839d" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.111398 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.139399 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-gbmwh" podStartSLOduration=3.139382822 podStartE2EDuration="3.139382822s" podCreationTimestamp="2026-03-01 09:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:00.088556014 +0000 UTC m=+1209.330435211" watchObservedRunningTime="2026-03-01 09:28:00.139382822 +0000 UTC m=+1209.381262009" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.197986 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f89zl" event={"ID":"e623b24a-64a5-4209-86bb-1814ae9c400b","Type":"ContainerStarted","Data":"e4885fc9359de722bc25d23b6b1337620c1bc715e4d3353e33a0de4919152488"} Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.215325 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bxx5d" event={"ID":"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd","Type":"ContainerStarted","Data":"5a8a6506253c42d0a8617675f0f4091a77e441fa914d2388057587c940f25850"} Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.234248 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539288-9klf4"] Mar 01 09:28:00 crc kubenswrapper[4792]: E0301 09:28:00.234584 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b802af-3af5-430f-b06f-709fd4606fd0" containerName="init" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.234596 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b802af-3af5-430f-b06f-709fd4606fd0" containerName="init" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.234752 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b802af-3af5-430f-b06f-709fd4606fd0" containerName="init" Mar 01 
09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.235248 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539288-9klf4" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.246505 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.246702 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.276474 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.314086 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539288-9klf4"] Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.344204 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7796644bc-dm4ww"] Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.358989 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7796644bc-dm4ww"] Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.409157 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67qpx\" (UniqueName: \"kubernetes.io/projected/2e8f417d-a9b7-4969-9e24-785fa8baf9c4-kube-api-access-67qpx\") pod \"auto-csr-approver-29539288-9klf4\" (UID: \"2e8f417d-a9b7-4969-9e24-785fa8baf9c4\") " pod="openshift-infra/auto-csr-approver-29539288-9klf4" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.525072 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67qpx\" (UniqueName: \"kubernetes.io/projected/2e8f417d-a9b7-4969-9e24-785fa8baf9c4-kube-api-access-67qpx\") pod \"auto-csr-approver-29539288-9klf4\" (UID: 
\"2e8f417d-a9b7-4969-9e24-785fa8baf9c4\") " pod="openshift-infra/auto-csr-approver-29539288-9klf4" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.542171 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67qpx\" (UniqueName: \"kubernetes.io/projected/2e8f417d-a9b7-4969-9e24-785fa8baf9c4-kube-api-access-67qpx\") pod \"auto-csr-approver-29539288-9klf4\" (UID: \"2e8f417d-a9b7-4969-9e24-785fa8baf9c4\") " pod="openshift-infra/auto-csr-approver-29539288-9klf4" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.587155 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539288-9klf4" Mar 01 09:28:01 crc kubenswrapper[4792]: I0301 09:28:01.221509 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539288-9klf4"] Mar 01 09:28:01 crc kubenswrapper[4792]: I0301 09:28:01.249005 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" event={"ID":"4e1a508b-9db4-414a-b06d-2f01a2c132a1","Type":"ContainerStarted","Data":"245d762d71bcda288abc788f3327cb41b0f2c4d918af85b1ae35b3d1615c472c"} Mar 01 09:28:01 crc kubenswrapper[4792]: I0301 09:28:01.249124 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:28:01 crc kubenswrapper[4792]: W0301 09:28:01.295637 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e8f417d_a9b7_4969_9e24_785fa8baf9c4.slice/crio-4b8eb3139f21d41482ab553230f14aad91a6b71c593740f60f59167f5d6203d0 WatchSource:0}: Error finding container 4b8eb3139f21d41482ab553230f14aad91a6b71c593740f60f59167f5d6203d0: Status 404 returned error can't find the container with id 4b8eb3139f21d41482ab553230f14aad91a6b71c593740f60f59167f5d6203d0 Mar 01 09:28:01 crc kubenswrapper[4792]: I0301 09:28:01.452218 4792 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="30b802af-3af5-430f-b06f-709fd4606fd0" path="/var/lib/kubelet/pods/30b802af-3af5-430f-b06f-709fd4606fd0/volumes" Mar 01 09:28:01 crc kubenswrapper[4792]: I0301 09:28:01.472039 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" podStartSLOduration=4.472013663 podStartE2EDuration="4.472013663s" podCreationTimestamp="2026-03-01 09:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:01.281875796 +0000 UTC m=+1210.523754993" watchObservedRunningTime="2026-03-01 09:28:01.472013663 +0000 UTC m=+1210.713892860" Mar 01 09:28:02 crc kubenswrapper[4792]: I0301 09:28:02.276307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539288-9klf4" event={"ID":"2e8f417d-a9b7-4969-9e24-785fa8baf9c4","Type":"ContainerStarted","Data":"4b8eb3139f21d41482ab553230f14aad91a6b71c593740f60f59167f5d6203d0"} Mar 01 09:28:04 crc kubenswrapper[4792]: I0301 09:28:04.314604 4792 generic.go:334] "Generic (PLEG): container finished" podID="3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" containerID="670f35fecbdbd6b75d71f5beac8b8d230ac85378c92ee4b0813d0d49f8a4dde5" exitCode=0 Mar 01 09:28:04 crc kubenswrapper[4792]: I0301 09:28:04.314649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2zjg5" event={"ID":"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d","Type":"ContainerDied","Data":"670f35fecbdbd6b75d71f5beac8b8d230ac85378c92ee4b0813d0d49f8a4dde5"} Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.916353 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.920712 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxwh8\" (UniqueName: \"kubernetes.io/projected/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-kube-api-access-jxwh8\") pod \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.921762 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-config-data\") pod \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.921804 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-fernet-keys\") pod \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.921845 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-credential-keys\") pod \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.921961 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-combined-ca-bundle\") pod \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.921985 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-scripts\") pod \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.926662 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-scripts" (OuterVolumeSpecName: "scripts") pod "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" (UID: "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.928088 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-kube-api-access-jxwh8" (OuterVolumeSpecName: "kube-api-access-jxwh8") pod "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" (UID: "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d"). InnerVolumeSpecName "kube-api-access-jxwh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.929565 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" (UID: "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.954199 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" (UID: "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.964640 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" (UID: "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.973368 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-config-data" (OuterVolumeSpecName: "config-data") pod "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" (UID: "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.023720 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.023770 4792 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.023780 4792 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.023793 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 
09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.023806 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.023817 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxwh8\" (UniqueName: \"kubernetes.io/projected/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-kube-api-access-jxwh8\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.336004 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2zjg5" event={"ID":"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d","Type":"ContainerDied","Data":"1f893a9e0566eb476f19571ee15d1e6b6197f05ea1b028ec4b64824306365a2d"} Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.336042 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f893a9e0566eb476f19571ee15d1e6b6197f05ea1b028ec4b64824306365a2d" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.336079 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.423015 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2zjg5"] Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.430996 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2zjg5"] Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.513577 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-m9l5f"] Mar 01 09:28:06 crc kubenswrapper[4792]: E0301 09:28:06.513954 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" containerName="keystone-bootstrap" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.513973 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" containerName="keystone-bootstrap" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.514130 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" containerName="keystone-bootstrap" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.514620 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.516375 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fr9vh" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.516726 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.516929 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.518952 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.522275 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.533164 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m9l5f"] Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.633418 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-scripts\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.633479 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l92rl\" (UniqueName: \"kubernetes.io/projected/7108e9ac-8215-41ca-ac84-3b3851142a42-kube-api-access-l92rl\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.633697 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-config-data\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.633767 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-credential-keys\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.633830 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-fernet-keys\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.633942 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-combined-ca-bundle\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.735779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-fernet-keys\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.735868 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-combined-ca-bundle\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.735935 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-scripts\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.735964 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l92rl\" (UniqueName: \"kubernetes.io/projected/7108e9ac-8215-41ca-ac84-3b3851142a42-kube-api-access-l92rl\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.736011 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-config-data\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.736031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-credential-keys\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.741852 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-config-data\") pod \"keystone-bootstrap-m9l5f\" (UID: 
\"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.742413 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-credential-keys\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.743758 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-scripts\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.743861 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-fernet-keys\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.745105 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-combined-ca-bundle\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.751853 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l92rl\" (UniqueName: \"kubernetes.io/projected/7108e9ac-8215-41ca-ac84-3b3851142a42-kube-api-access-l92rl\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 
09:28:06.893336 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:07 crc kubenswrapper[4792]: I0301 09:28:07.419776 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" path="/var/lib/kubelet/pods/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d/volumes" Mar 01 09:28:08 crc kubenswrapper[4792]: I0301 09:28:08.343746 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:28:08 crc kubenswrapper[4792]: E0301 09:28:08.353491 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice/crio-212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice\": RecentStats: unable to find data in memory cache]" Mar 01 09:28:08 crc kubenswrapper[4792]: I0301 09:28:08.411325 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-p8x5s"] Mar 01 09:28:08 crc kubenswrapper[4792]: I0301 09:28:08.411771 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="dnsmasq-dns" containerID="cri-o://b035d55e8df5d946580d2dfd8c4b3f7b4f07c6856584d562fa0676ed055fd3e5" gracePeriod=10 Mar 01 09:28:09 crc kubenswrapper[4792]: I0301 09:28:09.364894 4792 generic.go:334] "Generic (PLEG): container finished" podID="667fff68-7113-4dfe-86b4-34b80b41d326" containerID="b035d55e8df5d946580d2dfd8c4b3f7b4f07c6856584d562fa0676ed055fd3e5" exitCode=0 Mar 01 09:28:09 crc kubenswrapper[4792]: I0301 09:28:09.364947 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" event={"ID":"667fff68-7113-4dfe-86b4-34b80b41d326","Type":"ContainerDied","Data":"b035d55e8df5d946580d2dfd8c4b3f7b4f07c6856584d562fa0676ed055fd3e5"} Mar 01 09:28:11 crc kubenswrapper[4792]: I0301 09:28:11.418077 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Mar 01 09:28:16 crc kubenswrapper[4792]: I0301 09:28:16.417262 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Mar 01 09:28:18 crc kubenswrapper[4792]: E0301 09:28:18.550597 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice/crio-212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice\": RecentStats: unable to find data in memory cache]" Mar 01 09:28:22 crc kubenswrapper[4792]: E0301 09:28:22.148974 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d" Mar 01 09:28:22 crc kubenswrapper[4792]: E0301 09:28:22.149609 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pzxv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-bxx5d_openstack(9e6bad7a-881b-4ef4-9916-f447e2fc1ffd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:28:22 crc kubenswrapper[4792]: E0301 09:28:22.150871 4792 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-bxx5d" podUID="9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" Mar 01 09:28:22 crc kubenswrapper[4792]: E0301 09:28:22.483409 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d\\\"\"" pod="openstack/barbican-db-sync-bxx5d" podUID="9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" Mar 01 09:28:23 crc kubenswrapper[4792]: E0301 09:28:23.092364 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838" Mar 01 09:28:23 crc kubenswrapper[4792]: E0301 09:28:23.092739 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-84xmq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-gsxqb_openstack(737aa0a0-6e53-451e-9d5f-2deada87b5b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:28:23 crc kubenswrapper[4792]: E0301 09:28:23.093997 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-gsxqb" podUID="737aa0a0-6e53-451e-9d5f-2deada87b5b4" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.341623 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.454778 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-config\") pod \"667fff68-7113-4dfe-86b4-34b80b41d326\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.454855 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77bc7\" (UniqueName: \"kubernetes.io/projected/667fff68-7113-4dfe-86b4-34b80b41d326-kube-api-access-77bc7\") pod \"667fff68-7113-4dfe-86b4-34b80b41d326\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.455233 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-sb\") pod \"667fff68-7113-4dfe-86b4-34b80b41d326\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " Mar 01 
09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.455314 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-dns-svc\") pod \"667fff68-7113-4dfe-86b4-34b80b41d326\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.455458 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-nb\") pod \"667fff68-7113-4dfe-86b4-34b80b41d326\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.464588 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.469262 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/667fff68-7113-4dfe-86b4-34b80b41d326-kube-api-access-77bc7" (OuterVolumeSpecName: "kube-api-access-77bc7") pod "667fff68-7113-4dfe-86b4-34b80b41d326" (UID: "667fff68-7113-4dfe-86b4-34b80b41d326"). InnerVolumeSpecName "kube-api-access-77bc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.494046 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f89zl" event={"ID":"e623b24a-64a5-4209-86bb-1814ae9c400b","Type":"ContainerStarted","Data":"fded697acef2e4939680efd28c30dd0707c1b449f6152a36c981b92695845052"} Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.496747 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" event={"ID":"667fff68-7113-4dfe-86b4-34b80b41d326","Type":"ContainerDied","Data":"386c10f9f902b49dfdcc7e63fa9588772c851427627a00112a847f427ef0dd79"} Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.496809 4792 scope.go:117] "RemoveContainer" containerID="b035d55e8df5d946580d2dfd8c4b3f7b4f07c6856584d562fa0676ed055fd3e5" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.496964 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.501059 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerStarted","Data":"4fd86a535781157d736a326c5d3973270ef9e75f90ca5c7a184728477646f601"} Mar 01 09:28:23 crc kubenswrapper[4792]: E0301 09:28:23.502064 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-gsxqb" podUID="737aa0a0-6e53-451e-9d5f-2deada87b5b4" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.513392 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-f89zl" podStartSLOduration=2.60839733 
podStartE2EDuration="26.513342096s" podCreationTimestamp="2026-03-01 09:27:57 +0000 UTC" firstStartedPulling="2026-03-01 09:27:59.179601974 +0000 UTC m=+1208.421481171" lastFinishedPulling="2026-03-01 09:28:23.08454674 +0000 UTC m=+1232.326425937" observedRunningTime="2026-03-01 09:28:23.509556833 +0000 UTC m=+1232.751436030" watchObservedRunningTime="2026-03-01 09:28:23.513342096 +0000 UTC m=+1232.755221293" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.520464 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "667fff68-7113-4dfe-86b4-34b80b41d326" (UID: "667fff68-7113-4dfe-86b4-34b80b41d326"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.524247 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-config" (OuterVolumeSpecName: "config") pod "667fff68-7113-4dfe-86b4-34b80b41d326" (UID: "667fff68-7113-4dfe-86b4-34b80b41d326"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.544703 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "667fff68-7113-4dfe-86b4-34b80b41d326" (UID: "667fff68-7113-4dfe-86b4-34b80b41d326"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.548704 4792 scope.go:117] "RemoveContainer" containerID="e23b7bf7394271fedc5e4abedff3f86bcbcc1a5fe82a8164474dcf3fb06696d9" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.555718 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "667fff68-7113-4dfe-86b4-34b80b41d326" (UID: "667fff68-7113-4dfe-86b4-34b80b41d326"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.556490 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m9l5f"] Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.559748 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.560985 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77bc7\" (UniqueName: \"kubernetes.io/projected/667fff68-7113-4dfe-86b4-34b80b41d326-kube-api-access-77bc7\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.561024 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.561037 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.561090 4792 reconciler_common.go:293] "Volume 
detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.829322 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-p8x5s"] Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.858591 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-p8x5s"] Mar 01 09:28:24 crc kubenswrapper[4792]: I0301 09:28:24.516947 4792 generic.go:334] "Generic (PLEG): container finished" podID="2e8f417d-a9b7-4969-9e24-785fa8baf9c4" containerID="0b4398286a53ae92983ef93db19480d6804e4b83a997761fc68f16627e65ecd5" exitCode=0 Mar 01 09:28:24 crc kubenswrapper[4792]: I0301 09:28:24.517024 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539288-9klf4" event={"ID":"2e8f417d-a9b7-4969-9e24-785fa8baf9c4","Type":"ContainerDied","Data":"0b4398286a53ae92983ef93db19480d6804e4b83a997761fc68f16627e65ecd5"} Mar 01 09:28:24 crc kubenswrapper[4792]: I0301 09:28:24.519966 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m9l5f" event={"ID":"7108e9ac-8215-41ca-ac84-3b3851142a42","Type":"ContainerStarted","Data":"4a5e793bcbd54f67d2aa56894763cca0ce1c06ab0ab5c25152dbd8e3b2985066"} Mar 01 09:28:24 crc kubenswrapper[4792]: I0301 09:28:24.520010 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m9l5f" event={"ID":"7108e9ac-8215-41ca-ac84-3b3851142a42","Type":"ContainerStarted","Data":"2b1de65ec699fbf1baf52d26468fee288183e04f12523b93783a52b6e1c65a17"} Mar 01 09:28:24 crc kubenswrapper[4792]: I0301 09:28:24.551954 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-m9l5f" podStartSLOduration=18.551938049 podStartE2EDuration="18.551938049s" podCreationTimestamp="2026-03-01 09:28:06 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:24.550015492 +0000 UTC m=+1233.791894689" watchObservedRunningTime="2026-03-01 09:28:24.551938049 +0000 UTC m=+1233.793817236" Mar 01 09:28:25 crc kubenswrapper[4792]: I0301 09:28:25.422100 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" path="/var/lib/kubelet/pods/667fff68-7113-4dfe-86b4-34b80b41d326/volumes" Mar 01 09:28:25 crc kubenswrapper[4792]: I0301 09:28:25.527985 4792 generic.go:334] "Generic (PLEG): container finished" podID="66aba873-81b0-452a-81f9-73cc18445180" containerID="acaeed4a0d4cc4c819f11994601b2946e5e093014d4fc45dbb6ce057d16aef6a" exitCode=0 Mar 01 09:28:25 crc kubenswrapper[4792]: I0301 09:28:25.528059 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gbmwh" event={"ID":"66aba873-81b0-452a-81f9-73cc18445180","Type":"ContainerDied","Data":"acaeed4a0d4cc4c819f11994601b2946e5e093014d4fc45dbb6ce057d16aef6a"} Mar 01 09:28:25 crc kubenswrapper[4792]: I0301 09:28:25.529690 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerStarted","Data":"ac3d91f96c6efaf7baa089ecdf84d5d3fe923f61545b960c7a1aa1d77e8db2e5"} Mar 01 09:28:25 crc kubenswrapper[4792]: I0301 09:28:25.905329 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539288-9klf4" Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.103669 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67qpx\" (UniqueName: \"kubernetes.io/projected/2e8f417d-a9b7-4969-9e24-785fa8baf9c4-kube-api-access-67qpx\") pod \"2e8f417d-a9b7-4969-9e24-785fa8baf9c4\" (UID: \"2e8f417d-a9b7-4969-9e24-785fa8baf9c4\") " Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.114847 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e8f417d-a9b7-4969-9e24-785fa8baf9c4-kube-api-access-67qpx" (OuterVolumeSpecName: "kube-api-access-67qpx") pod "2e8f417d-a9b7-4969-9e24-785fa8baf9c4" (UID: "2e8f417d-a9b7-4969-9e24-785fa8baf9c4"). InnerVolumeSpecName "kube-api-access-67qpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.205392 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67qpx\" (UniqueName: \"kubernetes.io/projected/2e8f417d-a9b7-4969-9e24-785fa8baf9c4-kube-api-access-67qpx\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.417659 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.538433 4792 generic.go:334] "Generic (PLEG): container finished" podID="e623b24a-64a5-4209-86bb-1814ae9c400b" containerID="fded697acef2e4939680efd28c30dd0707c1b449f6152a36c981b92695845052" exitCode=0 Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.538495 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f89zl" 
event={"ID":"e623b24a-64a5-4209-86bb-1814ae9c400b","Type":"ContainerDied","Data":"fded697acef2e4939680efd28c30dd0707c1b449f6152a36c981b92695845052"} Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.541761 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539288-9klf4" event={"ID":"2e8f417d-a9b7-4969-9e24-785fa8baf9c4","Type":"ContainerDied","Data":"4b8eb3139f21d41482ab553230f14aad91a6b71c593740f60f59167f5d6203d0"} Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.541805 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539288-9klf4" Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.541813 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b8eb3139f21d41482ab553230f14aad91a6b71c593740f60f59167f5d6203d0" Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.953067 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.966890 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539282-dcfkb"] Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.975051 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539282-dcfkb"] Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.119155 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-config\") pod \"66aba873-81b0-452a-81f9-73cc18445180\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.119248 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-combined-ca-bundle\") pod \"66aba873-81b0-452a-81f9-73cc18445180\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.119343 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95d98\" (UniqueName: \"kubernetes.io/projected/66aba873-81b0-452a-81f9-73cc18445180-kube-api-access-95d98\") pod \"66aba873-81b0-452a-81f9-73cc18445180\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.142237 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66aba873-81b0-452a-81f9-73cc18445180-kube-api-access-95d98" (OuterVolumeSpecName: "kube-api-access-95d98") pod "66aba873-81b0-452a-81f9-73cc18445180" (UID: "66aba873-81b0-452a-81f9-73cc18445180"). InnerVolumeSpecName "kube-api-access-95d98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.149625 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66aba873-81b0-452a-81f9-73cc18445180" (UID: "66aba873-81b0-452a-81f9-73cc18445180"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.152128 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-config" (OuterVolumeSpecName: "config") pod "66aba873-81b0-452a-81f9-73cc18445180" (UID: "66aba873-81b0-452a-81f9-73cc18445180"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.221545 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.221749 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.221831 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95d98\" (UniqueName: \"kubernetes.io/projected/66aba873-81b0-452a-81f9-73cc18445180-kube-api-access-95d98\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.419967 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fcb7c96-6ab5-413c-b776-d1bc938e85c0" path="/var/lib/kubelet/pods/1fcb7c96-6ab5-413c-b776-d1bc938e85c0/volumes" Mar 01 09:28:27 crc 
kubenswrapper[4792]: I0301 09:28:27.554402 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gbmwh" event={"ID":"66aba873-81b0-452a-81f9-73cc18445180","Type":"ContainerDied","Data":"0d7cc39250ba374cd76a9cab23087e1f60274ea1219843b9f8303a09038d9fc3"} Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.554445 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.554471 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d7cc39250ba374cd76a9cab23087e1f60274ea1219843b9f8303a09038d9fc3" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.849946 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-drpr8"] Mar 01 09:28:27 crc kubenswrapper[4792]: E0301 09:28:27.850926 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8f417d-a9b7-4969-9e24-785fa8baf9c4" containerName="oc" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.850944 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e8f417d-a9b7-4969-9e24-785fa8baf9c4" containerName="oc" Mar 01 09:28:27 crc kubenswrapper[4792]: E0301 09:28:27.850971 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="init" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.850978 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="init" Mar 01 09:28:27 crc kubenswrapper[4792]: E0301 09:28:27.853462 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="dnsmasq-dns" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.853482 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="dnsmasq-dns" Mar 01 09:28:27 
crc kubenswrapper[4792]: E0301 09:28:27.853498 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66aba873-81b0-452a-81f9-73cc18445180" containerName="neutron-db-sync" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.853506 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="66aba873-81b0-452a-81f9-73cc18445180" containerName="neutron-db-sync" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.853959 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="dnsmasq-dns" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.853978 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="66aba873-81b0-452a-81f9-73cc18445180" containerName="neutron-db-sync" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.853992 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e8f417d-a9b7-4969-9e24-785fa8baf9c4" containerName="oc" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.857307 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.911035 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-drpr8"] Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.956615 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-dns-svc\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.956674 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-nb\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.956695 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-sb\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.956762 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-config\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.956785 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hg4gm\" (UniqueName: \"kubernetes.io/projected/82467164-5e77-4ea0-beee-b3a70126c075-kube-api-access-hg4gm\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.964093 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7bbc5b86d6-8b672"] Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.966447 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.972618 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.973003 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.973138 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.973439 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k82sx" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.981491 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bbc5b86d6-8b672"] Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.059514 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-config\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.059586 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg4gm\" (UniqueName: 
\"kubernetes.io/projected/82467164-5e77-4ea0-beee-b3a70126c075-kube-api-access-hg4gm\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.059650 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-dns-svc\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.059699 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-nb\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.059722 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-sb\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.060787 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-sb\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.062795 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-dns-svc\") pod 
\"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.062986 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-nb\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.063182 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-config\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.109678 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg4gm\" (UniqueName: \"kubernetes.io/projected/82467164-5e77-4ea0-beee-b3a70126c075-kube-api-access-hg4gm\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.160764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsctq\" (UniqueName: \"kubernetes.io/projected/013566fd-5627-422a-809a-e81a8ec059d9-kube-api-access-hsctq\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.160872 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-combined-ca-bundle\") pod \"neutron-7bbc5b86d6-8b672\" 
(UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.160919 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-config\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.160964 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-ovndb-tls-certs\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.161004 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-httpd-config\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.211897 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.261955 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-combined-ca-bundle\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.262018 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-config\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.262150 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-ovndb-tls-certs\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.262192 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-httpd-config\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.262236 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsctq\" (UniqueName: \"kubernetes.io/projected/013566fd-5627-422a-809a-e81a8ec059d9-kube-api-access-hsctq\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc 
kubenswrapper[4792]: I0301 09:28:28.267025 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-ovndb-tls-certs\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.267451 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-httpd-config\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.268217 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-config\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.269953 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-combined-ca-bundle\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.288457 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsctq\" (UniqueName: \"kubernetes.io/projected/013566fd-5627-422a-809a-e81a8ec059d9-kube-api-access-hsctq\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.296682 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.570729 4792 generic.go:334] "Generic (PLEG): container finished" podID="7108e9ac-8215-41ca-ac84-3b3851142a42" containerID="4a5e793bcbd54f67d2aa56894763cca0ce1c06ab0ab5c25152dbd8e3b2985066" exitCode=0 Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.570767 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m9l5f" event={"ID":"7108e9ac-8215-41ca-ac84-3b3851142a42","Type":"ContainerDied","Data":"4a5e793bcbd54f67d2aa56894763cca0ce1c06ab0ab5c25152dbd8e3b2985066"} Mar 01 09:28:28 crc kubenswrapper[4792]: E0301 09:28:28.763062 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice/crio-212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice\": RecentStats: unable to find data in memory cache]" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.281449 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-f89zl" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.389678 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e623b24a-64a5-4209-86bb-1814ae9c400b-logs\") pod \"e623b24a-64a5-4209-86bb-1814ae9c400b\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.389755 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f79b\" (UniqueName: \"kubernetes.io/projected/e623b24a-64a5-4209-86bb-1814ae9c400b-kube-api-access-4f79b\") pod \"e623b24a-64a5-4209-86bb-1814ae9c400b\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.389816 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-scripts\") pod \"e623b24a-64a5-4209-86bb-1814ae9c400b\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.390358 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e623b24a-64a5-4209-86bb-1814ae9c400b-logs" (OuterVolumeSpecName: "logs") pod "e623b24a-64a5-4209-86bb-1814ae9c400b" (UID: "e623b24a-64a5-4209-86bb-1814ae9c400b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.390674 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-combined-ca-bundle\") pod \"e623b24a-64a5-4209-86bb-1814ae9c400b\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.390711 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-config-data\") pod \"e623b24a-64a5-4209-86bb-1814ae9c400b\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.391174 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e623b24a-64a5-4209-86bb-1814ae9c400b-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.401019 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-scripts" (OuterVolumeSpecName: "scripts") pod "e623b24a-64a5-4209-86bb-1814ae9c400b" (UID: "e623b24a-64a5-4209-86bb-1814ae9c400b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.401086 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e623b24a-64a5-4209-86bb-1814ae9c400b-kube-api-access-4f79b" (OuterVolumeSpecName: "kube-api-access-4f79b") pod "e623b24a-64a5-4209-86bb-1814ae9c400b" (UID: "e623b24a-64a5-4209-86bb-1814ae9c400b"). InnerVolumeSpecName "kube-api-access-4f79b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.430476 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-config-data" (OuterVolumeSpecName: "config-data") pod "e623b24a-64a5-4209-86bb-1814ae9c400b" (UID: "e623b24a-64a5-4209-86bb-1814ae9c400b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.430500 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e623b24a-64a5-4209-86bb-1814ae9c400b" (UID: "e623b24a-64a5-4209-86bb-1814ae9c400b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.492599 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f79b\" (UniqueName: \"kubernetes.io/projected/e623b24a-64a5-4209-86bb-1814ae9c400b-kube-api-access-4f79b\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.492632 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.492643 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.492651 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-config-data\") on node \"crc\" DevicePath \"\"" Mar 
01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.580929 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-f89zl" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.580950 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f89zl" event={"ID":"e623b24a-64a5-4209-86bb-1814ae9c400b","Type":"ContainerDied","Data":"e4885fc9359de722bc25d23b6b1337620c1bc715e4d3353e33a0de4919152488"} Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.581031 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4885fc9359de722bc25d23b6b1337620c1bc715e4d3353e33a0de4919152488" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.995672 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-66955dfdb5-6j2wx"] Mar 01 09:28:29 crc kubenswrapper[4792]: E0301 09:28:29.996663 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e623b24a-64a5-4209-86bb-1814ae9c400b" containerName="placement-db-sync" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.996736 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e623b24a-64a5-4209-86bb-1814ae9c400b" containerName="placement-db-sync" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.996954 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e623b24a-64a5-4209-86bb-1814ae9c400b" containerName="placement-db-sync" Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:29.997823 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66955dfdb5-6j2wx" Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.011226 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66955dfdb5-6j2wx"] Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.011674 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.012065 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.104797 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-public-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx" Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.104870 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-httpd-config\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx" Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.104983 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8fsh\" (UniqueName: \"kubernetes.io/projected/90e1a395-ebc6-49ca-9924-c64283c12ec4-kube-api-access-h8fsh\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx" Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.105017 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-config\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx" Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.105035 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-combined-ca-bundle\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx" Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.105089 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-internal-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx" Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.105145 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-ovndb-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx" Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.207050 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-ovndb-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx" Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.207256 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-public-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.207305 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-httpd-config\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.207345 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8fsh\" (UniqueName: \"kubernetes.io/projected/90e1a395-ebc6-49ca-9924-c64283c12ec4-kube-api-access-h8fsh\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.207385 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-config\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.207401 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-combined-ca-bundle\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.207434 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-internal-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.213016 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-config\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.213575 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-public-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.216229 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-httpd-config\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.226616 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-ovndb-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.227434 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-internal-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.237573 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-combined-ca-bundle\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.239053 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8fsh\" (UniqueName: \"kubernetes.io/projected/90e1a395-ebc6-49ca-9924-c64283c12ec4-kube-api-access-h8fsh\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.321819 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.386810 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7f7447dcd6-cpnn5"]
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.388099 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.394229 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.394250 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.394575 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.394803 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.394967 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-z8zjf"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.460205 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-public-tls-certs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.460260 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-combined-ca-bundle\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.460308 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9trll\" (UniqueName: \"kubernetes.io/projected/947b32da-5664-42ff-a665-ac182dea1433-kube-api-access-9trll\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.460348 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-config-data\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.460401 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-internal-tls-certs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.460464 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-scripts\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.460525 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/947b32da-5664-42ff-a665-ac182dea1433-logs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.482587 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f7447dcd6-cpnn5"]
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.563845 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-scripts\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.563937 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/947b32da-5664-42ff-a665-ac182dea1433-logs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.563967 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-public-tls-certs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.564495 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-combined-ca-bundle\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.565018 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/947b32da-5664-42ff-a665-ac182dea1433-logs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.565046 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9trll\" (UniqueName: \"kubernetes.io/projected/947b32da-5664-42ff-a665-ac182dea1433-kube-api-access-9trll\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.565102 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-config-data\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.565190 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-internal-tls-certs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.568368 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-scripts\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.568955 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-combined-ca-bundle\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.569250 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-public-tls-certs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.577572 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-config-data\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.577980 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-internal-tls-certs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.600469 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9trll\" (UniqueName: \"kubernetes.io/projected/947b32da-5664-42ff-a665-ac182dea1433-kube-api-access-9trll\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.763971 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.036533 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m9l5f"
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.196634 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-credential-keys\") pod \"7108e9ac-8215-41ca-ac84-3b3851142a42\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") "
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.196687 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l92rl\" (UniqueName: \"kubernetes.io/projected/7108e9ac-8215-41ca-ac84-3b3851142a42-kube-api-access-l92rl\") pod \"7108e9ac-8215-41ca-ac84-3b3851142a42\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") "
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.196752 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-combined-ca-bundle\") pod \"7108e9ac-8215-41ca-ac84-3b3851142a42\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") "
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.196775 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-config-data\") pod \"7108e9ac-8215-41ca-ac84-3b3851142a42\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") "
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.196802 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-fernet-keys\") pod \"7108e9ac-8215-41ca-ac84-3b3851142a42\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") "
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.196918 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-scripts\") pod \"7108e9ac-8215-41ca-ac84-3b3851142a42\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") "
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.205756 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-scripts" (OuterVolumeSpecName: "scripts") pod "7108e9ac-8215-41ca-ac84-3b3851142a42" (UID: "7108e9ac-8215-41ca-ac84-3b3851142a42"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.214900 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7108e9ac-8215-41ca-ac84-3b3851142a42" (UID: "7108e9ac-8215-41ca-ac84-3b3851142a42"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.216408 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7108e9ac-8215-41ca-ac84-3b3851142a42" (UID: "7108e9ac-8215-41ca-ac84-3b3851142a42"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.219051 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7108e9ac-8215-41ca-ac84-3b3851142a42-kube-api-access-l92rl" (OuterVolumeSpecName: "kube-api-access-l92rl") pod "7108e9ac-8215-41ca-ac84-3b3851142a42" (UID: "7108e9ac-8215-41ca-ac84-3b3851142a42"). InnerVolumeSpecName "kube-api-access-l92rl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.272971 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-config-data" (OuterVolumeSpecName: "config-data") pod "7108e9ac-8215-41ca-ac84-3b3851142a42" (UID: "7108e9ac-8215-41ca-ac84-3b3851142a42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.288639 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7108e9ac-8215-41ca-ac84-3b3851142a42" (UID: "7108e9ac-8215-41ca-ac84-3b3851142a42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.300339 4792 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.300367 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l92rl\" (UniqueName: \"kubernetes.io/projected/7108e9ac-8215-41ca-ac84-3b3851142a42-kube-api-access-l92rl\") on node \"crc\" DevicePath \"\""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.300379 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.300388 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.300397 4792 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.300406 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.605129 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m9l5f" event={"ID":"7108e9ac-8215-41ca-ac84-3b3851142a42","Type":"ContainerDied","Data":"2b1de65ec699fbf1baf52d26468fee288183e04f12523b93783a52b6e1c65a17"}
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.605166 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b1de65ec699fbf1baf52d26468fee288183e04f12523b93783a52b6e1c65a17"
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.605224 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m9l5f"
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.608251 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerStarted","Data":"10b8b301db81f96185e1ca933d6d257371da7cfb2a533a1434c80a6ff2a5895f"}
Mar 01 09:28:32 crc kubenswrapper[4792]: W0301 09:28:32.702156 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod947b32da_5664_42ff_a665_ac182dea1433.slice/crio-5d91235d937c34cd2f7190314bb2d67a56ac04c2b63da01a91c659eee91c9179 WatchSource:0}: Error finding container 5d91235d937c34cd2f7190314bb2d67a56ac04c2b63da01a91c659eee91c9179: Status 404 returned error can't find the container with id 5d91235d937c34cd2f7190314bb2d67a56ac04c2b63da01a91c659eee91c9179
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.707205 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f7447dcd6-cpnn5"]
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.908474 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66955dfdb5-6j2wx"]
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.940101 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-drpr8"]
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.035252 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bbc5b86d6-8b672"]
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.184450 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-749f685d77-ggsln"]
Mar 01 09:28:33 crc kubenswrapper[4792]: E0301 09:28:33.186046 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7108e9ac-8215-41ca-ac84-3b3851142a42" containerName="keystone-bootstrap"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.186331 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7108e9ac-8215-41ca-ac84-3b3851142a42" containerName="keystone-bootstrap"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.186581 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7108e9ac-8215-41ca-ac84-3b3851142a42" containerName="keystone-bootstrap"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.188163 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.191105 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.191549 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fr9vh"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.191722 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.192416 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.192711 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.192782 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.211604 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-749f685d77-ggsln"]
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.235652 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-credential-keys\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.235719 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkxsj\" (UniqueName: \"kubernetes.io/projected/b60e7776-3e2a-4e08-900d-cd39a29a78bc-kube-api-access-rkxsj\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.235864 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-scripts\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.235888 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-public-tls-certs\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.235934 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-config-data\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.235983 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-fernet-keys\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.236038 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-internal-tls-certs\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.236124 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-combined-ca-bundle\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.347441 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-config-data\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.348082 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-fernet-keys\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.351760 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-internal-tls-certs\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.351809 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-combined-ca-bundle\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.355232 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-fernet-keys\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.355587 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-combined-ca-bundle\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.357675 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-config-data\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.358484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-credential-keys\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.358531 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkxsj\" (UniqueName: \"kubernetes.io/projected/b60e7776-3e2a-4e08-900d-cd39a29a78bc-kube-api-access-rkxsj\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.358664 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-scripts\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.358693 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-public-tls-certs\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.362356 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-internal-tls-certs\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.367255 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-credential-keys\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.368076 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-scripts\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.368812 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-public-tls-certs\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.380006 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkxsj\" (UniqueName: \"kubernetes.io/projected/b60e7776-3e2a-4e08-900d-cd39a29a78bc-kube-api-access-rkxsj\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.577583 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.624951 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66955dfdb5-6j2wx" event={"ID":"90e1a395-ebc6-49ca-9924-c64283c12ec4","Type":"ContainerStarted","Data":"171f60709b450c4b056a3daad4b651d3e95b5d1e5702d55092b0d107469669dc"}
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.624996 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66955dfdb5-6j2wx" event={"ID":"90e1a395-ebc6-49ca-9924-c64283c12ec4","Type":"ContainerStarted","Data":"0e01f5820150bb847cd98736209f40a8cfae4c1d42fc832b6f738e299bc2db88"}
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.631429 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7447dcd6-cpnn5" event={"ID":"947b32da-5664-42ff-a665-ac182dea1433","Type":"ContainerStarted","Data":"9288e4694d44c535f5db6a3588b5f57161da3432c7fdcf10f8abb56af7bd4be9"}
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.631469 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7447dcd6-cpnn5" event={"ID":"947b32da-5664-42ff-a665-ac182dea1433","Type":"ContainerStarted","Data":"4555af75d6b8f403b02d065ff405189b215772f534b2956e72f7441d250ba2de"}
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.631481 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7447dcd6-cpnn5" event={"ID":"947b32da-5664-42ff-a665-ac182dea1433","Type":"ContainerStarted","Data":"5d91235d937c34cd2f7190314bb2d67a56ac04c2b63da01a91c659eee91c9179"}
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.631843 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.631963 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.635158 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bbc5b86d6-8b672" event={"ID":"013566fd-5627-422a-809a-e81a8ec059d9","Type":"ContainerStarted","Data":"ca55616f2e5de805229eb50d3f643de3b33e4b53039ccf7569dc0337fc8e14a5"}
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.635187 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bbc5b86d6-8b672" event={"ID":"013566fd-5627-422a-809a-e81a8ec059d9","Type":"ContainerStarted","Data":"120df1c67b7935983b4052512da03cb57b70749fae0a4306db0f53d6bed8199c"}
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.645950 4792 generic.go:334] "Generic (PLEG): container finished" podID="82467164-5e77-4ea0-beee-b3a70126c075" containerID="610fd4808253a760f048f5c379d2e39f3eec919b6601aa983fb2a5f9ece83ce6" exitCode=0
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.645996 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" event={"ID":"82467164-5e77-4ea0-beee-b3a70126c075","Type":"ContainerDied","Data":"610fd4808253a760f048f5c379d2e39f3eec919b6601aa983fb2a5f9ece83ce6"}
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.646019 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" event={"ID":"82467164-5e77-4ea0-beee-b3a70126c075","Type":"ContainerStarted","Data":"8b9ae7283e8be3bed9877439415e05f84a3d2c818de0aa317ad4a53c2c6bc4d6"}
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.663586 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7f7447dcd6-cpnn5" podStartSLOduration=3.663555251 podStartE2EDuration="3.663555251s" podCreationTimestamp="2026-03-01 09:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:33.647735923 +0000 UTC m=+1242.889615120" watchObservedRunningTime="2026-03-01 09:28:33.663555251 +0000 UTC m=+1242.905434448"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.900132 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-749f685d77-ggsln"]
Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.662379 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bbc5b86d6-8b672" event={"ID":"013566fd-5627-422a-809a-e81a8ec059d9","Type":"ContainerStarted","Data":"2f7d0d1b6918c9e962de8e492d201469a86475bd623f4086bd8a9e1f30a74d63"}
Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.662735 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7bbc5b86d6-8b672"
Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.664345 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-749f685d77-ggsln" event={"ID":"b60e7776-3e2a-4e08-900d-cd39a29a78bc","Type":"ContainerStarted","Data":"46610eac821e49ac438645f9b9439a857b8afb4d70878718f9ff80be2c356756"}
Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.667302 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" event={"ID":"82467164-5e77-4ea0-beee-b3a70126c075","Type":"ContainerStarted","Data":"4fab450c58bb439c06a393afb4862347d838c516e4e1d271f6a55936b2e24c70"}
Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.667444 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8"
Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.669153 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66955dfdb5-6j2wx" event={"ID":"90e1a395-ebc6-49ca-9924-c64283c12ec4","Type":"ContainerStarted","Data":"5ed4ad048c8c293d006aad77d5956e915974d390f9663c49f5f3c523ced04257"}
Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.669696 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openstack/neutron-66955dfdb5-6j2wx" Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.690364 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7bbc5b86d6-8b672" podStartSLOduration=7.690349465 podStartE2EDuration="7.690349465s" podCreationTimestamp="2026-03-01 09:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:34.685722011 +0000 UTC m=+1243.927601208" watchObservedRunningTime="2026-03-01 09:28:34.690349465 +0000 UTC m=+1243.932228662" Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.753689 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-66955dfdb5-6j2wx" podStartSLOduration=5.753672009 podStartE2EDuration="5.753672009s" podCreationTimestamp="2026-03-01 09:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:34.723717584 +0000 UTC m=+1243.965596781" watchObservedRunningTime="2026-03-01 09:28:34.753672009 +0000 UTC m=+1243.995551196" Mar 01 09:28:35 crc kubenswrapper[4792]: I0301 09:28:35.431767 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" podStartSLOduration=8.431728363 podStartE2EDuration="8.431728363s" podCreationTimestamp="2026-03-01 09:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:34.755972385 +0000 UTC m=+1243.997851582" watchObservedRunningTime="2026-03-01 09:28:35.431728363 +0000 UTC m=+1244.673607560" Mar 01 09:28:35 crc kubenswrapper[4792]: I0301 09:28:35.680449 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-749f685d77-ggsln" 
event={"ID":"b60e7776-3e2a-4e08-900d-cd39a29a78bc","Type":"ContainerStarted","Data":"9d05507d7c784c54e92f11b486349b46e1781ea9f40abf0c6c7a47b5a0f5a762"} Mar 01 09:28:35 crc kubenswrapper[4792]: I0301 09:28:35.680617 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-749f685d77-ggsln" Mar 01 09:28:35 crc kubenswrapper[4792]: I0301 09:28:35.708914 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-749f685d77-ggsln" podStartSLOduration=2.708879435 podStartE2EDuration="2.708879435s" podCreationTimestamp="2026-03-01 09:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:35.70013982 +0000 UTC m=+1244.942019017" watchObservedRunningTime="2026-03-01 09:28:35.708879435 +0000 UTC m=+1244.950758632" Mar 01 09:28:38 crc kubenswrapper[4792]: I0301 09:28:38.214115 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:38 crc kubenswrapper[4792]: I0301 09:28:38.277270 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589bbb667-6gxlc"] Mar 01 09:28:38 crc kubenswrapper[4792]: I0301 09:28:38.277514 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerName="dnsmasq-dns" containerID="cri-o://245d762d71bcda288abc788f3327cb41b0f2c4d918af85b1ae35b3d1615c472c" gracePeriod=10 Mar 01 09:28:38 crc kubenswrapper[4792]: I0301 09:28:38.343342 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: connect: connection refused" Mar 01 09:28:38 crc kubenswrapper[4792]: I0301 09:28:38.722258 4792 
generic.go:334] "Generic (PLEG): container finished" podID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerID="245d762d71bcda288abc788f3327cb41b0f2c4d918af85b1ae35b3d1615c472c" exitCode=0 Mar 01 09:28:38 crc kubenswrapper[4792]: I0301 09:28:38.722300 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" event={"ID":"4e1a508b-9db4-414a-b06d-2f01a2c132a1","Type":"ContainerDied","Data":"245d762d71bcda288abc788f3327cb41b0f2c4d918af85b1ae35b3d1615c472c"} Mar 01 09:28:38 crc kubenswrapper[4792]: E0301 09:28:38.954697 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice/crio-212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785\": RecentStats: unable to find data in memory cache]" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.705503 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.814832 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" event={"ID":"4e1a508b-9db4-414a-b06d-2f01a2c132a1","Type":"ContainerDied","Data":"0fc29a339e8c09faaadb240388644d964ff0e254b0fbfad00516d578fb8bff1b"} Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.815189 4792 scope.go:117] "RemoveContainer" containerID="245d762d71bcda288abc788f3327cb41b0f2c4d918af85b1ae35b3d1615c472c" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.815627 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.867649 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-sb\") pod \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.867793 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-config\") pod \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.867819 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv7bl\" (UniqueName: \"kubernetes.io/projected/4e1a508b-9db4-414a-b06d-2f01a2c132a1-kube-api-access-cv7bl\") pod \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.867858 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-nb\") pod \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.867877 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-dns-svc\") pod \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.891581 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4e1a508b-9db4-414a-b06d-2f01a2c132a1-kube-api-access-cv7bl" (OuterVolumeSpecName: "kube-api-access-cv7bl") pod "4e1a508b-9db4-414a-b06d-2f01a2c132a1" (UID: "4e1a508b-9db4-414a-b06d-2f01a2c132a1"). InnerVolumeSpecName "kube-api-access-cv7bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.902412 4792 scope.go:117] "RemoveContainer" containerID="70a638170c1ac600ae163392b053537afb8b6aa9687b87f677426f4f023db168" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.923819 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-config" (OuterVolumeSpecName: "config") pod "4e1a508b-9db4-414a-b06d-2f01a2c132a1" (UID: "4e1a508b-9db4-414a-b06d-2f01a2c132a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.936086 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e1a508b-9db4-414a-b06d-2f01a2c132a1" (UID: "4e1a508b-9db4-414a-b06d-2f01a2c132a1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.938394 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e1a508b-9db4-414a-b06d-2f01a2c132a1" (UID: "4e1a508b-9db4-414a-b06d-2f01a2c132a1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.941477 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e1a508b-9db4-414a-b06d-2f01a2c132a1" (UID: "4e1a508b-9db4-414a-b06d-2f01a2c132a1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.970278 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.970400 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.970467 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv7bl\" (UniqueName: \"kubernetes.io/projected/4e1a508b-9db4-414a-b06d-2f01a2c132a1-kube-api-access-cv7bl\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.970538 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.970601 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.175619 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589bbb667-6gxlc"] Mar 01 
09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.181968 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589bbb667-6gxlc"] Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.423976 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" path="/var/lib/kubelet/pods/4e1a508b-9db4-414a-b06d-2f01a2c132a1/volumes" Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.823086 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bxx5d" event={"ID":"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd","Type":"ContainerStarted","Data":"eb8be93bccd98a25e35b0b04ca4b752b359f9eaa5d34f76412ea01464dd8c3f9"} Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.825059 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gsxqb" event={"ID":"737aa0a0-6e53-451e-9d5f-2deada87b5b4","Type":"ContainerStarted","Data":"5753e582a896b76584d26c8b6fbaf1b0c86841fe9960e87056d0ee4ab735dcee"} Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.827322 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerStarted","Data":"131d1c48281fda07ee509861ecd19ed50a8dc2c67c40d98a6892403dc5e2415a"} Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.827431 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-central-agent" containerID="cri-o://4fd86a535781157d736a326c5d3973270ef9e75f90ca5c7a184728477646f601" gracePeriod=30 Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.827688 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.827733 4792 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="proxy-httpd" containerID="cri-o://131d1c48281fda07ee509861ecd19ed50a8dc2c67c40d98a6892403dc5e2415a" gracePeriod=30 Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.827774 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="sg-core" containerID="cri-o://10b8b301db81f96185e1ca933d6d257371da7cfb2a533a1434c80a6ff2a5895f" gracePeriod=30 Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.827813 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-notification-agent" containerID="cri-o://ac3d91f96c6efaf7baa089ecdf84d5d3fe923f61545b960c7a1aa1d77e8db2e5" gracePeriod=30 Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.871959 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-bxx5d" podStartSLOduration=3.222251536 podStartE2EDuration="46.871942903s" podCreationTimestamp="2026-03-01 09:27:57 +0000 UTC" firstStartedPulling="2026-03-01 09:27:59.114605319 +0000 UTC m=+1208.356484516" lastFinishedPulling="2026-03-01 09:28:42.764296686 +0000 UTC m=+1252.006175883" observedRunningTime="2026-03-01 09:28:43.850027875 +0000 UTC m=+1253.091907072" watchObservedRunningTime="2026-03-01 09:28:43.871942903 +0000 UTC m=+1253.113822100" Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.900791 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-gsxqb" podStartSLOduration=2.934817501 podStartE2EDuration="46.900776011s" podCreationTimestamp="2026-03-01 09:27:57 +0000 UTC" firstStartedPulling="2026-03-01 09:27:58.769330444 +0000 UTC m=+1208.011209641" lastFinishedPulling="2026-03-01 09:28:42.735288954 +0000 UTC m=+1251.977168151" 
observedRunningTime="2026-03-01 09:28:43.87304073 +0000 UTC m=+1253.114919927" watchObservedRunningTime="2026-03-01 09:28:43.900776011 +0000 UTC m=+1253.142655208" Mar 01 09:28:44 crc kubenswrapper[4792]: I0301 09:28:44.837834 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerID="131d1c48281fda07ee509861ecd19ed50a8dc2c67c40d98a6892403dc5e2415a" exitCode=0 Mar 01 09:28:44 crc kubenswrapper[4792]: I0301 09:28:44.838153 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerID="10b8b301db81f96185e1ca933d6d257371da7cfb2a533a1434c80a6ff2a5895f" exitCode=2 Mar 01 09:28:44 crc kubenswrapper[4792]: I0301 09:28:44.838163 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerID="4fd86a535781157d736a326c5d3973270ef9e75f90ca5c7a184728477646f601" exitCode=0 Mar 01 09:28:44 crc kubenswrapper[4792]: I0301 09:28:44.837928 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerDied","Data":"131d1c48281fda07ee509861ecd19ed50a8dc2c67c40d98a6892403dc5e2415a"} Mar 01 09:28:44 crc kubenswrapper[4792]: I0301 09:28:44.838193 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerDied","Data":"10b8b301db81f96185e1ca933d6d257371da7cfb2a533a1434c80a6ff2a5895f"} Mar 01 09:28:44 crc kubenswrapper[4792]: I0301 09:28:44.838205 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerDied","Data":"4fd86a535781157d736a326c5d3973270ef9e75f90ca5c7a184728477646f601"} Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.857703 4792 generic.go:334] "Generic (PLEG): container finished" podID="9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" 
containerID="eb8be93bccd98a25e35b0b04ca4b752b359f9eaa5d34f76412ea01464dd8c3f9" exitCode=0 Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.857850 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bxx5d" event={"ID":"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd","Type":"ContainerDied","Data":"eb8be93bccd98a25e35b0b04ca4b752b359f9eaa5d34f76412ea01464dd8c3f9"} Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.865197 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerID="ac3d91f96c6efaf7baa089ecdf84d5d3fe923f61545b960c7a1aa1d77e8db2e5" exitCode=0 Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.865289 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerDied","Data":"ac3d91f96c6efaf7baa089ecdf84d5d3fe923f61545b960c7a1aa1d77e8db2e5"} Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.865355 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerDied","Data":"4a875a1ec016948e8ea916192a157e8f35d24195db5c29c64d33740934a209c2"} Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.865372 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a875a1ec016948e8ea916192a157e8f35d24195db5c29c64d33740934a209c2" Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.884753 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.871369731 podStartE2EDuration="49.884733645s" podCreationTimestamp="2026-03-01 09:27:57 +0000 UTC" firstStartedPulling="2026-03-01 09:27:58.755777331 +0000 UTC m=+1207.997656528" lastFinishedPulling="2026-03-01 09:28:42.769141245 +0000 UTC m=+1252.011020442" observedRunningTime="2026-03-01 09:28:43.904948334 +0000 UTC m=+1253.146827531" 
watchObservedRunningTime="2026-03-01 09:28:46.884733645 +0000 UTC m=+1256.126612842" Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.893675 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.051802 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-run-httpd\") pod \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.052386 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-log-httpd\") pod \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.052517 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" (UID: "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.052546 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-config-data\") pod \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.052716 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" (UID: "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.053827 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-scripts\") pod \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.053934 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-sg-core-conf-yaml\") pod \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.053986 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgczn\" (UniqueName: \"kubernetes.io/projected/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-kube-api-access-zgczn\") pod \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.054084 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-combined-ca-bundle\") pod \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.055253 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.055281 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.060565 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-kube-api-access-zgczn" (OuterVolumeSpecName: "kube-api-access-zgczn") pod "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" (UID: "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd"). InnerVolumeSpecName "kube-api-access-zgczn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.060667 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-scripts" (OuterVolumeSpecName: "scripts") pod "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" (UID: "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.086155 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" (UID: "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.141823 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" (UID: "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.156852 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.156891 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.156905 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgczn\" (UniqueName: \"kubernetes.io/projected/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-kube-api-access-zgczn\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.156933 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.163003 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-config-data" (OuterVolumeSpecName: "config-data") pod "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" (UID: "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.258325 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.871392 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.904384 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.934377 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.989993 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:47 crc kubenswrapper[4792]: E0301 09:28:47.990465 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerName="dnsmasq-dns" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990485 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerName="dnsmasq-dns" Mar 01 09:28:47 crc kubenswrapper[4792]: E0301 09:28:47.990506 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="sg-core" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990512 4792 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="sg-core" Mar 01 09:28:47 crc kubenswrapper[4792]: E0301 09:28:47.990528 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="proxy-httpd" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990534 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="proxy-httpd" Mar 01 09:28:47 crc kubenswrapper[4792]: E0301 09:28:47.990547 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerName="init" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990554 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerName="init" Mar 01 09:28:47 crc kubenswrapper[4792]: E0301 09:28:47.990564 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-central-agent" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990570 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-central-agent" Mar 01 09:28:47 crc kubenswrapper[4792]: E0301 09:28:47.990581 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-notification-agent" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990587 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-notification-agent" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990753 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-notification-agent" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990767 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerName="dnsmasq-dns" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990775 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="proxy-httpd" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990791 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-central-agent" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990800 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="sg-core" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.992435 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.999934 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.000128 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.001770 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.156650 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:48 crc kubenswrapper[4792]: E0301 09:28:48.157242 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-2hgmj log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="e3d1f920-cbe0-4883-8030-826eab25677d" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.173356 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.173724 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-run-httpd\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.173775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-log-httpd\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.173798 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.173828 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-config-data\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.173848 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hgmj\" (UniqueName: 
\"kubernetes.io/projected/e3d1f920-cbe0-4883-8030-826eab25677d-kube-api-access-2hgmj\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.173880 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-scripts\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275135 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-log-httpd\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275192 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275228 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-config-data\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275257 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hgmj\" (UniqueName: \"kubernetes.io/projected/e3d1f920-cbe0-4883-8030-826eab25677d-kube-api-access-2hgmj\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc 
kubenswrapper[4792]: I0301 09:28:48.275300 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-scripts\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275329 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-run-httpd\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275727 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-run-httpd\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275987 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-log-httpd\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.291464 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-config-data\") pod \"ceilometer-0\" (UID: 
\"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.292087 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.296581 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hgmj\" (UniqueName: \"kubernetes.io/projected/e3d1f920-cbe0-4883-8030-826eab25677d-kube-api-access-2hgmj\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.299540 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-scripts\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.303700 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.395280 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.477782 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzxv7\" (UniqueName: \"kubernetes.io/projected/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-kube-api-access-pzxv7\") pod \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.478175 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-db-sync-config-data\") pod \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.478280 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-combined-ca-bundle\") pod \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.481826 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" (UID: "9e6bad7a-881b-4ef4-9916-f447e2fc1ffd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.482734 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-kube-api-access-pzxv7" (OuterVolumeSpecName: "kube-api-access-pzxv7") pod "9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" (UID: "9e6bad7a-881b-4ef4-9916-f447e2fc1ffd"). 
InnerVolumeSpecName "kube-api-access-pzxv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.503822 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" (UID: "9e6bad7a-881b-4ef4-9916-f447e2fc1ffd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.583556 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzxv7\" (UniqueName: \"kubernetes.io/projected/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-kube-api-access-pzxv7\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.583620 4792 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.583636 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.885339 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.885394 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.885325 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bxx5d" event={"ID":"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd","Type":"ContainerDied","Data":"5a8a6506253c42d0a8617675f0f4091a77e441fa914d2388057587c940f25850"} Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.885569 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a8a6506253c42d0a8617675f0f4091a77e441fa914d2388057587c940f25850" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.900436 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.988757 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-log-httpd\") pod \"e3d1f920-cbe0-4883-8030-826eab25677d\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.989423 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e3d1f920-cbe0-4883-8030-826eab25677d" (UID: "e3d1f920-cbe0-4883-8030-826eab25677d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.989489 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-combined-ca-bundle\") pod \"e3d1f920-cbe0-4883-8030-826eab25677d\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.989560 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-config-data\") pod \"e3d1f920-cbe0-4883-8030-826eab25677d\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.990077 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hgmj\" (UniqueName: \"kubernetes.io/projected/e3d1f920-cbe0-4883-8030-826eab25677d-kube-api-access-2hgmj\") pod \"e3d1f920-cbe0-4883-8030-826eab25677d\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.990199 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-scripts\") pod \"e3d1f920-cbe0-4883-8030-826eab25677d\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.990220 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-sg-core-conf-yaml\") pod \"e3d1f920-cbe0-4883-8030-826eab25677d\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.990280 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-run-httpd\") pod \"e3d1f920-cbe0-4883-8030-826eab25677d\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.990646 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.991172 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e3d1f920-cbe0-4883-8030-826eab25677d" (UID: "e3d1f920-cbe0-4883-8030-826eab25677d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.996127 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3d1f920-cbe0-4883-8030-826eab25677d" (UID: "e3d1f920-cbe0-4883-8030-826eab25677d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:48.999733 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-scripts" (OuterVolumeSpecName: "scripts") pod "e3d1f920-cbe0-4883-8030-826eab25677d" (UID: "e3d1f920-cbe0-4883-8030-826eab25677d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.000324 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e3d1f920-cbe0-4883-8030-826eab25677d" (UID: "e3d1f920-cbe0-4883-8030-826eab25677d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.000649 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-config-data" (OuterVolumeSpecName: "config-data") pod "e3d1f920-cbe0-4883-8030-826eab25677d" (UID: "e3d1f920-cbe0-4883-8030-826eab25677d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.004526 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3d1f920-cbe0-4883-8030-826eab25677d-kube-api-access-2hgmj" (OuterVolumeSpecName: "kube-api-access-2hgmj") pod "e3d1f920-cbe0-4883-8030-826eab25677d" (UID: "e3d1f920-cbe0-4883-8030-826eab25677d"). InnerVolumeSpecName "kube-api-access-2hgmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.094906 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hgmj\" (UniqueName: \"kubernetes.io/projected/e3d1f920-cbe0-4883-8030-826eab25677d-kube-api-access-2hgmj\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.100630 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.100643 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.100654 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.100666 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.100678 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:49 crc kubenswrapper[4792]: E0301 09:28:49.206116 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice/crio-212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice\": RecentStats: unable to find data in memory cache]" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.238337 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-65f4d58895-tvn59"] Mar 01 09:28:49 crc kubenswrapper[4792]: E0301 09:28:49.238761 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" containerName="barbican-db-sync" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.238780 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" containerName="barbican-db-sync" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.246403 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" containerName="barbican-db-sync" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.247408 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.252371 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-wjs57" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.266983 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.267080 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.290937 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-64f98fd86b-96l6n"] Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.293096 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.305800 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.311980 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65f4d58895-tvn59"] Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.342440 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64f98fd86b-96l6n"] Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.391860 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-h9lg8"] Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.393731 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407407 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24mz2\" (UniqueName: \"kubernetes.io/projected/d30c642c-b4ae-495a-8acd-cc8be4a0f412-kube-api-access-24mz2\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407459 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26fbd30a-a485-4463-9aac-bb695c43e9e3-logs\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407482 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s92q\" (UniqueName: \"kubernetes.io/projected/26fbd30a-a485-4463-9aac-bb695c43e9e3-kube-api-access-7s92q\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407548 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-config-data-custom\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407573 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-combined-ca-bundle\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407641 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d30c642c-b4ae-495a-8acd-cc8be4a0f412-logs\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407681 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-config-data\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407732 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-combined-ca-bundle\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.408247 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-h9lg8"]
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.408427 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-config-data-custom\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.408481 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-config-data\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.440245 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" path="/var/lib/kubelet/pods/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd/volumes"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510307 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-sb\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510352 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-combined-ca-bundle\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510391 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d30c642c-b4ae-495a-8acd-cc8be4a0f412-logs\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510429 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-config-data\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510453 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-combined-ca-bundle\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510468 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-config-data-custom\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510488 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-config-data\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510510 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-config\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510544 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24mz2\" (UniqueName: \"kubernetes.io/projected/d30c642c-b4ae-495a-8acd-cc8be4a0f412-kube-api-access-24mz2\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510568 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-dns-svc\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510588 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26fbd30a-a485-4463-9aac-bb695c43e9e3-logs\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510605 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s92q\" (UniqueName: \"kubernetes.io/projected/26fbd30a-a485-4463-9aac-bb695c43e9e3-kube-api-access-7s92q\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510644 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcf7g\" (UniqueName: \"kubernetes.io/projected/c418cc81-d55e-4678-8aad-caa1573d366a-kube-api-access-mcf7g\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510662 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-nb\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510694 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-config-data-custom\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.511464 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d30c642c-b4ae-495a-8acd-cc8be4a0f412-logs\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.512526 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26fbd30a-a485-4463-9aac-bb695c43e9e3-logs\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.520828 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-combined-ca-bundle\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.520859 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-combined-ca-bundle\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.521930 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-config-data\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.523212 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-config-data-custom\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.526835 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-config-data\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.546461 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-686d9f9896-9zsh2"]
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.562232 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.562664 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-config-data-custom\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.567926 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-686d9f9896-9zsh2"]
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.613833 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.615009 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-config\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.615105 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-dns-svc\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.615159 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcf7g\" (UniqueName: \"kubernetes.io/projected/c418cc81-d55e-4678-8aad-caa1573d366a-kube-api-access-mcf7g\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.615187 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-nb\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.615254 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-sb\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.615923 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-dns-svc\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.616392 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-sb\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.616430 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-nb\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.616528 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-config\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.616795 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s92q\" (UniqueName: \"kubernetes.io/projected/26fbd30a-a485-4463-9aac-bb695c43e9e3-kube-api-access-7s92q\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.634595 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24mz2\" (UniqueName: \"kubernetes.io/projected/d30c642c-b4ae-495a-8acd-cc8be4a0f412-kube-api-access-24mz2\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.650919 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcf7g\" (UniqueName: \"kubernetes.io/projected/c418cc81-d55e-4678-8aad-caa1573d366a-kube-api-access-mcf7g\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.716676 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.716967 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-combined-ca-bundle\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.717099 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4721d2a8-efb5-4fa3-9779-797448455198-logs\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.717198 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data-custom\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.717269 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58r6b\" (UniqueName: \"kubernetes.io/projected/4721d2a8-efb5-4fa3-9779-797448455198-kube-api-access-58r6b\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.727235 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.818528 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data-custom\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.818566 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58r6b\" (UniqueName: \"kubernetes.io/projected/4721d2a8-efb5-4fa3-9779-797448455198-kube-api-access-58r6b\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.818663 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.818686 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-combined-ca-bundle\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.818738 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4721d2a8-efb5-4fa3-9779-797448455198-logs\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.819120 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4721d2a8-efb5-4fa3-9779-797448455198-logs\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.824028 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.825745 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data-custom\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.832872 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-combined-ca-bundle\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.835188 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58r6b\" (UniqueName: \"kubernetes.io/projected/4721d2a8-efb5-4fa3-9779-797448455198-kube-api-access-58r6b\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.884237 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-65f4d58895-tvn59"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.897451 4792 generic.go:334] "Generic (PLEG): container finished" podID="737aa0a0-6e53-451e-9d5f-2deada87b5b4" containerID="5753e582a896b76584d26c8b6fbaf1b0c86841fe9960e87056d0ee4ab735dcee" exitCode=0
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.897613 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.897509 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gsxqb" event={"ID":"737aa0a0-6e53-451e-9d5f-2deada87b5b4","Type":"ContainerDied","Data":"5753e582a896b76584d26c8b6fbaf1b0c86841fe9960e87056d0ee4ab735dcee"}
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.910189 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n"
Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.986650 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-686d9f9896-9zsh2"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.001347 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.016938 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.054037 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.057096 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.060382 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.060505 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.066552 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.136860 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpnn8\" (UniqueName: \"kubernetes.io/projected/73747536-7a61-4c63-87c7-9e4c72471fb1-kube-api-access-kpnn8\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.136963 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-config-data\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.136990 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-log-httpd\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.137020 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-run-httpd\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.137092 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-scripts\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.137130 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.137177 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.238505 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-scripts\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.238566 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.238614 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.238671 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpnn8\" (UniqueName: \"kubernetes.io/projected/73747536-7a61-4c63-87c7-9e4c72471fb1-kube-api-access-kpnn8\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.238707 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-config-data\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.238724 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-log-httpd\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.238746 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-run-httpd\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.239268 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-run-httpd\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.246126 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-log-httpd\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.249141 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.249815 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-scripts\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.250107 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-config-data\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.251126 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.264651 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpnn8\" (UniqueName: \"kubernetes.io/projected/73747536-7a61-4c63-87c7-9e4c72471fb1-kube-api-access-kpnn8\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.352514 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-h9lg8"]
Mar 01 09:28:50 crc kubenswrapper[4792]: W0301 09:28:50.355357 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc418cc81_d55e_4678_8aad_caa1573d366a.slice/crio-bdfd8bffec09df88b8baa122fcaea45ce8205a2b6eb0d36b5009c9f753f43ecc WatchSource:0}: Error finding container bdfd8bffec09df88b8baa122fcaea45ce8205a2b6eb0d36b5009c9f753f43ecc: Status 404 returned error can't find the container with id bdfd8bffec09df88b8baa122fcaea45ce8205a2b6eb0d36b5009c9f753f43ecc
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.399187 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.480148 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65f4d58895-tvn59"]
Mar 01 09:28:50 crc kubenswrapper[4792]: W0301 09:28:50.498651 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26fbd30a_a485_4463_9aac_bb695c43e9e3.slice/crio-23f5431953315a66506197a7fed33c29c81e45caf335668eef31542ca24365d9 WatchSource:0}: Error finding container 23f5431953315a66506197a7fed33c29c81e45caf335668eef31542ca24365d9: Status 404 returned error can't find the container with id 23f5431953315a66506197a7fed33c29c81e45caf335668eef31542ca24365d9
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.673169 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64f98fd86b-96l6n"]
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.683651 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-686d9f9896-9zsh2"]
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.907527 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-686d9f9896-9zsh2" event={"ID":"4721d2a8-efb5-4fa3-9779-797448455198","Type":"ContainerStarted","Data":"128fecf3528e96b978b34551abfcb59be5f329262f2b7798f19013b2931fb841"}
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.910206 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" event={"ID":"d30c642c-b4ae-495a-8acd-cc8be4a0f412","Type":"ContainerStarted","Data":"c729a979d8c29010d2c655ac4de93ea2cacda6aa6f0a16f0f9981e0f5dbbf81d"}
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.912065 4792 generic.go:334] "Generic (PLEG): container finished" podID="c418cc81-d55e-4678-8aad-caa1573d366a" containerID="fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863" exitCode=0
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.912132 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" event={"ID":"c418cc81-d55e-4678-8aad-caa1573d366a","Type":"ContainerDied","Data":"fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863"}
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.912156 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" event={"ID":"c418cc81-d55e-4678-8aad-caa1573d366a","Type":"ContainerStarted","Data":"bdfd8bffec09df88b8baa122fcaea45ce8205a2b6eb0d36b5009c9f753f43ecc"}
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.918344 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65f4d58895-tvn59" event={"ID":"26fbd30a-a485-4463-9aac-bb695c43e9e3","Type":"ContainerStarted","Data":"23f5431953315a66506197a7fed33c29c81e45caf335668eef31542ca24365d9"}
Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.098898 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.295716 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gsxqb"
Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.371065 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-config-data\") pod \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") "
Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.371143 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-scripts\") pod \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") "
Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.371287 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84xmq\" (UniqueName: \"kubernetes.io/projected/737aa0a0-6e53-451e-9d5f-2deada87b5b4-kube-api-access-84xmq\") pod \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") "
Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.371328 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-combined-ca-bundle\") pod \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") "
Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.371366 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/737aa0a0-6e53-451e-9d5f-2deada87b5b4-etc-machine-id\") pod \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") "
Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.371484 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-db-sync-config-data\") pod \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") "
Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.374296 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/737aa0a0-6e53-451e-9d5f-2deada87b5b4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "737aa0a0-6e53-451e-9d5f-2deada87b5b4" (UID: "737aa0a0-6e53-451e-9d5f-2deada87b5b4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.376414 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737aa0a0-6e53-451e-9d5f-2deada87b5b4-kube-api-access-84xmq" (OuterVolumeSpecName: "kube-api-access-84xmq") pod "737aa0a0-6e53-451e-9d5f-2deada87b5b4" (UID: "737aa0a0-6e53-451e-9d5f-2deada87b5b4"). InnerVolumeSpecName "kube-api-access-84xmq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.377592 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "737aa0a0-6e53-451e-9d5f-2deada87b5b4" (UID: "737aa0a0-6e53-451e-9d5f-2deada87b5b4"). InnerVolumeSpecName "db-sync-config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.380672 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-scripts" (OuterVolumeSpecName: "scripts") pod "737aa0a0-6e53-451e-9d5f-2deada87b5b4" (UID: "737aa0a0-6e53-451e-9d5f-2deada87b5b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.432315 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "737aa0a0-6e53-451e-9d5f-2deada87b5b4" (UID: "737aa0a0-6e53-451e-9d5f-2deada87b5b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.439322 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3d1f920-cbe0-4883-8030-826eab25677d" path="/var/lib/kubelet/pods/e3d1f920-cbe0-4883-8030-826eab25677d/volumes" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.496704 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-config-data" (OuterVolumeSpecName: "config-data") pod "737aa0a0-6e53-451e-9d5f-2deada87b5b4" (UID: "737aa0a0-6e53-451e-9d5f-2deada87b5b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.497836 4792 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.497881 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.497894 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.497936 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84xmq\" (UniqueName: \"kubernetes.io/projected/737aa0a0-6e53-451e-9d5f-2deada87b5b4-kube-api-access-84xmq\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.497952 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.497962 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/737aa0a0-6e53-451e-9d5f-2deada87b5b4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.938383 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.938394 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gsxqb" event={"ID":"737aa0a0-6e53-451e-9d5f-2deada87b5b4","Type":"ContainerDied","Data":"dc7bf25ff6493b89f8d3d42eee96feaadc16025ec1b5d1ef3c591647a4fb7abf"} Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.938984 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc7bf25ff6493b89f8d3d42eee96feaadc16025ec1b5d1ef3c591647a4fb7abf" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.947103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerStarted","Data":"8f407749323a926af0db11e4921c8f80c0b44788d7a0172e925467426ce4a55c"} Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.951864 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-686d9f9896-9zsh2" event={"ID":"4721d2a8-efb5-4fa3-9779-797448455198","Type":"ContainerStarted","Data":"b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943"} Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.952141 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-686d9f9896-9zsh2" event={"ID":"4721d2a8-efb5-4fa3-9779-797448455198","Type":"ContainerStarted","Data":"b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b"} Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.952606 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.955143 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.961522 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" event={"ID":"c418cc81-d55e-4678-8aad-caa1573d366a","Type":"ContainerStarted","Data":"fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e"} Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.962491 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.985194 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-686d9f9896-9zsh2" podStartSLOduration=2.985166409 podStartE2EDuration="2.985166409s" podCreationTimestamp="2026-03-01 09:28:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:51.97705835 +0000 UTC m=+1261.218937567" watchObservedRunningTime="2026-03-01 09:28:51.985166409 +0000 UTC m=+1261.227045606" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.013894 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" podStartSLOduration=3.013873264 podStartE2EDuration="3.013873264s" podCreationTimestamp="2026-03-01 09:28:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:52.005676223 +0000 UTC m=+1261.247555430" watchObservedRunningTime="2026-03-01 09:28:52.013873264 +0000 UTC m=+1261.255752461" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.371118 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 01 09:28:52 crc kubenswrapper[4792]: E0301 09:28:52.371746 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737aa0a0-6e53-451e-9d5f-2deada87b5b4" containerName="cinder-db-sync" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.371758 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="737aa0a0-6e53-451e-9d5f-2deada87b5b4" containerName="cinder-db-sync" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.371954 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="737aa0a0-6e53-451e-9d5f-2deada87b5b4" containerName="cinder-db-sync" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.372804 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.384501 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.384719 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.388852 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.397714 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7rpd7" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.421804 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.444469 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.444530 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.444557 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a7e11fe-898a-442c-b619-d7ccea385948-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.444593 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttz97\" (UniqueName: \"kubernetes.io/projected/5a7e11fe-898a-442c-b619-d7ccea385948-kube-api-access-ttz97\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.444612 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.444639 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.532095 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-h9lg8"] Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.546191 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.546279 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.546304 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a7e11fe-898a-442c-b619-d7ccea385948-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.546353 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttz97\" (UniqueName: \"kubernetes.io/projected/5a7e11fe-898a-442c-b619-d7ccea385948-kube-api-access-ttz97\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.546368 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.546413 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " 
pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.549235 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a7e11fe-898a-442c-b619-d7ccea385948-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.553521 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.565180 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.565842 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.573625 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.587089 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttz97\" (UniqueName: 
\"kubernetes.io/projected/5a7e11fe-898a-442c-b619-d7ccea385948-kube-api-access-ttz97\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.644164 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7675674687-rrbg6"] Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.645624 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.678595 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7675674687-rrbg6"] Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.733692 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.750049 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j7zl\" (UniqueName: \"kubernetes.io/projected/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-kube-api-access-2j7zl\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.750141 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-dns-svc\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.750198 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-sb\") pod 
\"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.750220 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-nb\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.750244 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-config\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.807761 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.809152 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.817430 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.832202 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852027 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-config\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852123 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-scripts\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852166 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j7zl\" (UniqueName: \"kubernetes.io/projected/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-kube-api-access-2j7zl\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852215 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3261c1a4-1fc3-4584-a04b-a909176a21a7-logs\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852249 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data-custom\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852281 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852315 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3261c1a4-1fc3-4584-a04b-a909176a21a7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852343 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4cnd\" (UniqueName: \"kubernetes.io/projected/3261c1a4-1fc3-4584-a04b-a909176a21a7-kube-api-access-r4cnd\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-dns-svc\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852456 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852497 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-sb\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852527 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-nb\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.853474 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-config\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.853525 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-nb\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.854880 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-dns-svc\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: 
\"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.855620 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-sb\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.905640 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j7zl\" (UniqueName: \"kubernetes.io/projected/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-kube-api-access-2j7zl\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.953762 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3261c1a4-1fc3-4584-a04b-a909176a21a7-logs\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.953815 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data-custom\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.953846 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.953874 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3261c1a4-1fc3-4584-a04b-a909176a21a7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.953916 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4cnd\" (UniqueName: \"kubernetes.io/projected/3261c1a4-1fc3-4584-a04b-a909176a21a7-kube-api-access-r4cnd\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.953962 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.954014 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-scripts\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.954634 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3261c1a4-1fc3-4584-a04b-a909176a21a7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.955421 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3261c1a4-1fc3-4584-a04b-a909176a21a7-logs\") pod \"cinder-api-0\" (UID: 
\"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.958010 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data-custom\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.958435 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.964351 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.966960 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.988057 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4cnd\" (UniqueName: \"kubernetes.io/projected/3261c1a4-1fc3-4584-a04b-a909176a21a7-kube-api-access-r4cnd\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.994097 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-scripts\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.122974 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7f86869f48-jg6nw"] Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.124824 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.129118 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.129178 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.129276 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.139870 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f86869f48-jg6nw"] Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.267187 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq2tf\" (UniqueName: \"kubernetes.io/projected/6c97472a-b6b7-4fc4-b872-a318812f0999-kube-api-access-kq2tf\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.267257 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-config-data-custom\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.267291 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-combined-ca-bundle\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.267304 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-internal-tls-certs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: 
\"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.267326 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-public-tls-certs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.267356 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c97472a-b6b7-4fc4-b872-a318812f0999-logs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.267382 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-config-data\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.368777 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-config-data-custom\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.368846 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-combined-ca-bundle\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: 
\"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.368869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-internal-tls-certs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.368896 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-public-tls-certs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.368962 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c97472a-b6b7-4fc4-b872-a318812f0999-logs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.369001 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-config-data\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.369080 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq2tf\" (UniqueName: \"kubernetes.io/projected/6c97472a-b6b7-4fc4-b872-a318812f0999-kube-api-access-kq2tf\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " 
pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.369670 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c97472a-b6b7-4fc4-b872-a318812f0999-logs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.373071 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-config-data-custom\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.373391 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-combined-ca-bundle\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.373668 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-public-tls-certs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.374384 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-internal-tls-certs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: 
I0301 09:28:53.389001 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-config-data\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.396650 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq2tf\" (UniqueName: \"kubernetes.io/projected/6c97472a-b6b7-4fc4-b872-a318812f0999-kube-api-access-kq2tf\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.452454 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:54 crc kubenswrapper[4792]: I0301 09:28:54.024327 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65f4d58895-tvn59" event={"ID":"26fbd30a-a485-4463-9aac-bb695c43e9e3","Type":"ContainerStarted","Data":"1b653d0f5781794221f385cc2010eeaf82b408f4b2eb6e3922c0102992bc8f4f"} Mar 01 09:28:54 crc kubenswrapper[4792]: I0301 09:28:54.024590 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" podUID="c418cc81-d55e-4678-8aad-caa1573d366a" containerName="dnsmasq-dns" containerID="cri-o://fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e" gracePeriod=10 Mar 01 09:28:54 crc kubenswrapper[4792]: I0301 09:28:54.211404 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7675674687-rrbg6"] Mar 01 09:28:54 crc kubenswrapper[4792]: I0301 09:28:54.236024 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f86869f48-jg6nw"] Mar 01 09:28:54 crc kubenswrapper[4792]: I0301 09:28:54.389026 4792 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 01 09:28:54 crc kubenswrapper[4792]: W0301 09:28:54.414871 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3261c1a4_1fc3_4584_a04b_a909176a21a7.slice/crio-878d29e29b8947901ac3b55766798c44fc0595df994212304bdbb24caf194ff0 WatchSource:0}: Error finding container 878d29e29b8947901ac3b55766798c44fc0595df994212304bdbb24caf194ff0: Status 404 returned error can't find the container with id 878d29e29b8947901ac3b55766798c44fc0595df994212304bdbb24caf194ff0 Mar 01 09:28:54 crc kubenswrapper[4792]: I0301 09:28:54.437980 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.056478 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.057840 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" event={"ID":"d30c642c-b4ae-495a-8acd-cc8be4a0f412","Type":"ContainerStarted","Data":"e1d7268f85aab2519b72baa24d075192715425b6b0034f558542dc731de246d1"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.057898 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" event={"ID":"d30c642c-b4ae-495a-8acd-cc8be4a0f412","Type":"ContainerStarted","Data":"5eb771ebcc4fb088cc9d586d34d67eee927b15054e27473a11dea7307e6e8eda"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.063563 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a7e11fe-898a-442c-b619-d7ccea385948","Type":"ContainerStarted","Data":"35af3639b921b65729861f597174631d1ccc2f9baac748d7d2575658d3be08b9"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.065051 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3261c1a4-1fc3-4584-a04b-a909176a21a7","Type":"ContainerStarted","Data":"878d29e29b8947901ac3b55766798c44fc0595df994212304bdbb24caf194ff0"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.066612 4792 generic.go:334] "Generic (PLEG): container finished" podID="c418cc81-d55e-4678-8aad-caa1573d366a" containerID="fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e" exitCode=0 Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.066658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" event={"ID":"c418cc81-d55e-4678-8aad-caa1573d366a","Type":"ContainerDied","Data":"fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.066675 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" event={"ID":"c418cc81-d55e-4678-8aad-caa1573d366a","Type":"ContainerDied","Data":"bdfd8bffec09df88b8baa122fcaea45ce8205a2b6eb0d36b5009c9f753f43ecc"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.066692 4792 scope.go:117] "RemoveContainer" containerID="fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.066823 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.090310 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65f4d58895-tvn59" event={"ID":"26fbd30a-a485-4463-9aac-bb695c43e9e3","Type":"ContainerStarted","Data":"6c1dbcb00cf56971be88b525c52f9576a288f8b7e09492e30c48b16b99cb4b57"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.099678 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f86869f48-jg6nw" event={"ID":"6c97472a-b6b7-4fc4-b872-a318812f0999","Type":"ContainerStarted","Data":"05fcac918e336fbdbec1df9312c0287b079540d558f63f641f32c22e0617f400"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.099734 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f86869f48-jg6nw" event={"ID":"6c97472a-b6b7-4fc4-b872-a318812f0999","Type":"ContainerStarted","Data":"6d204bf177c538e6871de5e9d16e20c8bf22d250d590cffb0112e0c43ed73d59"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.099747 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f86869f48-jg6nw" event={"ID":"6c97472a-b6b7-4fc4-b872-a318812f0999","Type":"ContainerStarted","Data":"20f3c43ea4550355b0c719f618fbe77a0797f616d74f172246bcf6104f3d4e32"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.132337 4792 generic.go:334] "Generic (PLEG): container finished" podID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerID="1bc8af094ffa7e776635e0ebadb742f3592eec13f050af88c7f45cc46dd6b7ae" exitCode=0 Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.132374 4792 scope.go:117] "RemoveContainer" containerID="fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.132451 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7675674687-rrbg6" 
event={"ID":"b5f281f2-c77c-49cc-93a4-e7ed029f29bb","Type":"ContainerDied","Data":"1bc8af094ffa7e776635e0ebadb742f3592eec13f050af88c7f45cc46dd6b7ae"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.132484 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7675674687-rrbg6" event={"ID":"b5f281f2-c77c-49cc-93a4-e7ed029f29bb","Type":"ContainerStarted","Data":"23c50a76020a45988e337315d0efa1b099af136e2bf3deb4ff1bf47f7e64507f"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.140035 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" podStartSLOduration=3.222343751 podStartE2EDuration="6.140010607s" podCreationTimestamp="2026-03-01 09:28:49 +0000 UTC" firstStartedPulling="2026-03-01 09:28:50.682962685 +0000 UTC m=+1259.924841882" lastFinishedPulling="2026-03-01 09:28:53.600629541 +0000 UTC m=+1262.842508738" observedRunningTime="2026-03-01 09:28:55.110735269 +0000 UTC m=+1264.352614466" watchObservedRunningTime="2026-03-01 09:28:55.140010607 +0000 UTC m=+1264.381889804" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.171086 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7f86869f48-jg6nw" podStartSLOduration=2.171060759 podStartE2EDuration="2.171060759s" podCreationTimestamp="2026-03-01 09:28:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:55.149114051 +0000 UTC m=+1264.390993248" watchObservedRunningTime="2026-03-01 09:28:55.171060759 +0000 UTC m=+1264.412939956" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.178136 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerStarted","Data":"b90600661de56082bcb7ae56c666abd530a56c37ee4cdb45c25503eb416e5a05"} Mar 01 09:28:55 crc 
kubenswrapper[4792]: I0301 09:28:55.186070 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-65f4d58895-tvn59" podStartSLOduration=3.184217885 podStartE2EDuration="6.186037487s" podCreationTimestamp="2026-03-01 09:28:49 +0000 UTC" firstStartedPulling="2026-03-01 09:28:50.501273075 +0000 UTC m=+1259.743152272" lastFinishedPulling="2026-03-01 09:28:53.503092687 +0000 UTC m=+1262.744971874" observedRunningTime="2026-03-01 09:28:55.172762401 +0000 UTC m=+1264.414641598" watchObservedRunningTime="2026-03-01 09:28:55.186037487 +0000 UTC m=+1264.427916684" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.186341 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-dns-svc\") pod \"c418cc81-d55e-4678-8aad-caa1573d366a\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.186391 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-nb\") pod \"c418cc81-d55e-4678-8aad-caa1573d366a\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.186478 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-sb\") pod \"c418cc81-d55e-4678-8aad-caa1573d366a\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.186696 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcf7g\" (UniqueName: \"kubernetes.io/projected/c418cc81-d55e-4678-8aad-caa1573d366a-kube-api-access-mcf7g\") pod \"c418cc81-d55e-4678-8aad-caa1573d366a\" (UID: 
\"c418cc81-d55e-4678-8aad-caa1573d366a\") " Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.187274 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-config\") pod \"c418cc81-d55e-4678-8aad-caa1573d366a\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.245346 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c418cc81-d55e-4678-8aad-caa1573d366a-kube-api-access-mcf7g" (OuterVolumeSpecName: "kube-api-access-mcf7g") pod "c418cc81-d55e-4678-8aad-caa1573d366a" (UID: "c418cc81-d55e-4678-8aad-caa1573d366a"). InnerVolumeSpecName "kube-api-access-mcf7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.292895 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcf7g\" (UniqueName: \"kubernetes.io/projected/c418cc81-d55e-4678-8aad-caa1573d366a-kube-api-access-mcf7g\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.305017 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c418cc81-d55e-4678-8aad-caa1573d366a" (UID: "c418cc81-d55e-4678-8aad-caa1573d366a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.306750 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c418cc81-d55e-4678-8aad-caa1573d366a" (UID: "c418cc81-d55e-4678-8aad-caa1573d366a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.325301 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c418cc81-d55e-4678-8aad-caa1573d366a" (UID: "c418cc81-d55e-4678-8aad-caa1573d366a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.353083 4792 scope.go:117] "RemoveContainer" containerID="fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e" Mar 01 09:28:55 crc kubenswrapper[4792]: E0301 09:28:55.356064 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e\": container with ID starting with fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e not found: ID does not exist" containerID="fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.356105 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e"} err="failed to get container status \"fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e\": rpc error: code = NotFound desc = could not find container \"fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e\": container with ID starting with fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e not found: ID does not exist" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.356133 4792 scope.go:117] "RemoveContainer" containerID="fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863" Mar 01 09:28:55 crc kubenswrapper[4792]: E0301 09:28:55.356626 4792 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863\": container with ID starting with fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863 not found: ID does not exist" containerID="fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.356665 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863"} err="failed to get container status \"fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863\": rpc error: code = NotFound desc = could not find container \"fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863\": container with ID starting with fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863 not found: ID does not exist" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.397995 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.398024 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.398033 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.434138 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-config" 
(OuterVolumeSpecName: "config") pod "c418cc81-d55e-4678-8aad-caa1573d366a" (UID: "c418cc81-d55e-4678-8aad-caa1573d366a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.500162 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.741564 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-h9lg8"] Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.751641 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-h9lg8"] Mar 01 09:28:56 crc kubenswrapper[4792]: I0301 09:28:56.277456 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerStarted","Data":"30490c34339b14f02bc47ec05535e3863a7df94573da07df5321b635c71d016a"} Mar 01 09:28:56 crc kubenswrapper[4792]: I0301 09:28:56.278445 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:56 crc kubenswrapper[4792]: I0301 09:28:56.278505 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:56 crc kubenswrapper[4792]: I0301 09:28:56.287518 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 01 09:28:57 crc kubenswrapper[4792]: I0301 09:28:57.306402 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3261c1a4-1fc3-4584-a04b-a909176a21a7","Type":"ContainerStarted","Data":"ebb3c90c66d61b3c96e538bb3a90dec662ac00998a3c455a15d931238797819c"} Mar 01 09:28:57 crc kubenswrapper[4792]: I0301 09:28:57.313793 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-7675674687-rrbg6" event={"ID":"b5f281f2-c77c-49cc-93a4-e7ed029f29bb","Type":"ContainerStarted","Data":"fa470bfaa3e77451b41cb706fe5366ce1e220ca16dcf36e41c6bb50c6ad8d869"}
Mar 01 09:28:57 crc kubenswrapper[4792]: I0301 09:28:57.313879 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7675674687-rrbg6"
Mar 01 09:28:57 crc kubenswrapper[4792]: I0301 09:28:57.325301 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerStarted","Data":"23d29fcdda3c38825545e571c946d862613874d7ef2eb012aff4b4af60d5268e"}
Mar 01 09:28:57 crc kubenswrapper[4792]: I0301 09:28:57.329060 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a7e11fe-898a-442c-b619-d7ccea385948","Type":"ContainerStarted","Data":"375e2e53d30f9933a6980133a207824c3a278795869bc631327981b1d5488167"}
Mar 01 09:28:57 crc kubenswrapper[4792]: I0301 09:28:57.421030 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c418cc81-d55e-4678-8aad-caa1573d366a" path="/var/lib/kubelet/pods/c418cc81-d55e-4678-8aad-caa1573d366a/volumes"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.310528 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7bbc5b86d6-8b672"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.329665 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7675674687-rrbg6" podStartSLOduration=6.329622159 podStartE2EDuration="6.329622159s" podCreationTimestamp="2026-03-01 09:28:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:57.349005048 +0000 UTC m=+1266.590884245" watchObservedRunningTime="2026-03-01 09:28:58.329622159 +0000 UTC m=+1267.571501356"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.339599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a7e11fe-898a-442c-b619-d7ccea385948","Type":"ContainerStarted","Data":"70aca8fafacb1a0424dab05fe3248c2f55093cc69642e8fa33c562316f86f9cc"}
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.341611 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3261c1a4-1fc3-4584-a04b-a909176a21a7","Type":"ContainerStarted","Data":"6a2b4266f6e00d6aee5a043cd5566817d0900e0782c4c9c3fd61fe7b047d58d1"}
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.341752 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api" containerID="cri-o://6a2b4266f6e00d6aee5a043cd5566817d0900e0782c4c9c3fd61fe7b047d58d1" gracePeriod=30
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.341881 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api-log" containerID="cri-o://ebb3c90c66d61b3c96e538bb3a90dec662ac00998a3c455a15d931238797819c" gracePeriod=30
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.342185 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.405272 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.435118022 podStartE2EDuration="6.405253165s" podCreationTimestamp="2026-03-01 09:28:52 +0000 UTC" firstStartedPulling="2026-03-01 09:28:54.460323354 +0000 UTC m=+1263.702202551" lastFinishedPulling="2026-03-01 09:28:55.430458507 +0000 UTC m=+1264.672337694" observedRunningTime="2026-03-01 09:28:58.3923952 +0000 UTC m=+1267.634274397" watchObservedRunningTime="2026-03-01 09:28:58.405253165 +0000 UTC m=+1267.647132362"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.435618 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.43560435 podStartE2EDuration="6.43560435s" podCreationTimestamp="2026-03-01 09:28:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:58.433359045 +0000 UTC m=+1267.675238242" watchObservedRunningTime="2026-03-01 09:28:58.43560435 +0000 UTC m=+1267.677483537"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.662089 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66955dfdb5-6j2wx"]
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.663404 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66955dfdb5-6j2wx" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-httpd" containerID="cri-o://5ed4ad048c8c293d006aad77d5956e915974d390f9663c49f5f3c523ced04257" gracePeriod=30
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.663630 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66955dfdb5-6j2wx" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-api" containerID="cri-o://171f60709b450c4b056a3daad4b651d3e95b5d1e5702d55092b0d107469669dc" gracePeriod=30
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.676244 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-66955dfdb5-6j2wx" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.145:9696/\": read tcp 10.217.0.2:59862->10.217.0.145:9696: read: connection reset by peer"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.717346 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c8bdfb955-kjg92"]
Mar 01 09:28:58 crc kubenswrapper[4792]: E0301 09:28:58.717830 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c418cc81-d55e-4678-8aad-caa1573d366a" containerName="dnsmasq-dns"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.717859 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c418cc81-d55e-4678-8aad-caa1573d366a" containerName="dnsmasq-dns"
Mar 01 09:28:58 crc kubenswrapper[4792]: E0301 09:28:58.717874 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c418cc81-d55e-4678-8aad-caa1573d366a" containerName="init"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.717881 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c418cc81-d55e-4678-8aad-caa1573d366a" containerName="init"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.718091 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c418cc81-d55e-4678-8aad-caa1573d366a" containerName="dnsmasq-dns"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.719107 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.746354 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c8bdfb955-kjg92"]
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.802564 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-internal-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.802637 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-httpd-config\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.802694 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhvgv\" (UniqueName: \"kubernetes.io/projected/ceced30a-39e5-413f-a498-e5d4500f1eea-kube-api-access-lhvgv\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.802735 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-config\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.802775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-ovndb-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.802799 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-public-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.802868 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-combined-ca-bundle\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.904527 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-internal-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.904899 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-httpd-config\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.904952 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhvgv\" (UniqueName: \"kubernetes.io/projected/ceced30a-39e5-413f-a498-e5d4500f1eea-kube-api-access-lhvgv\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.905484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-config\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.905525 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-ovndb-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.905549 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-public-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.905611 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-combined-ca-bundle\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.919081 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-public-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.919492 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-internal-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.924724 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-ovndb-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.943294 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-config\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.943314 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-combined-ca-bundle\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.952789 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-httpd-config\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.966078 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhvgv\" (UniqueName: \"kubernetes.io/projected/ceced30a-39e5-413f-a498-e5d4500f1eea-kube-api-access-lhvgv\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.055968 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.392082 4792 generic.go:334] "Generic (PLEG): container finished" podID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerID="6a2b4266f6e00d6aee5a043cd5566817d0900e0782c4c9c3fd61fe7b047d58d1" exitCode=0
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.392398 4792 generic.go:334] "Generic (PLEG): container finished" podID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerID="ebb3c90c66d61b3c96e538bb3a90dec662ac00998a3c455a15d931238797819c" exitCode=143
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.392449 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3261c1a4-1fc3-4584-a04b-a909176a21a7","Type":"ContainerDied","Data":"6a2b4266f6e00d6aee5a043cd5566817d0900e0782c4c9c3fd61fe7b047d58d1"}
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.392479 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3261c1a4-1fc3-4584-a04b-a909176a21a7","Type":"ContainerDied","Data":"ebb3c90c66d61b3c96e538bb3a90dec662ac00998a3c455a15d931238797819c"}
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.412398 4792 generic.go:334] "Generic (PLEG): container finished" podID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerID="5ed4ad048c8c293d006aad77d5956e915974d390f9663c49f5f3c523ced04257" exitCode=0
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.472799 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66955dfdb5-6j2wx" event={"ID":"90e1a395-ebc6-49ca-9924-c64283c12ec4","Type":"ContainerDied","Data":"5ed4ad048c8c293d006aad77d5956e915974d390f9663c49f5f3c523ced04257"}
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.479695 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerStarted","Data":"3b83ecfb929c62345bb4c7e97676547d3a0dba6dd25466d0bdaa53eaba514d91"}
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.479776 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.845064 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.878878 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.164376608 podStartE2EDuration="10.878862727s" podCreationTimestamp="2026-03-01 09:28:49 +0000 UTC" firstStartedPulling="2026-03-01 09:28:51.114443956 +0000 UTC m=+1260.356323153" lastFinishedPulling="2026-03-01 09:28:58.828930075 +0000 UTC m=+1268.070809272" observedRunningTime="2026-03-01 09:28:59.569572075 +0000 UTC m=+1268.811451262" watchObservedRunningTime="2026-03-01 09:28:59.878862727 +0000 UTC m=+1269.120741924"
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.899986 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c8bdfb955-kjg92"]
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.943665 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-scripts\") pod \"3261c1a4-1fc3-4584-a04b-a909176a21a7\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") "
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.943740 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data-custom\") pod \"3261c1a4-1fc3-4584-a04b-a909176a21a7\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") "
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.943785 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3261c1a4-1fc3-4584-a04b-a909176a21a7-logs\") pod \"3261c1a4-1fc3-4584-a04b-a909176a21a7\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") "
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.943807 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-combined-ca-bundle\") pod \"3261c1a4-1fc3-4584-a04b-a909176a21a7\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") "
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.943867 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4cnd\" (UniqueName: \"kubernetes.io/projected/3261c1a4-1fc3-4584-a04b-a909176a21a7-kube-api-access-r4cnd\") pod \"3261c1a4-1fc3-4584-a04b-a909176a21a7\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") "
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.943966 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data\") pod \"3261c1a4-1fc3-4584-a04b-a909176a21a7\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") "
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.944039 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3261c1a4-1fc3-4584-a04b-a909176a21a7-etc-machine-id\") pod \"3261c1a4-1fc3-4584-a04b-a909176a21a7\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") "
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.944604 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3261c1a4-1fc3-4584-a04b-a909176a21a7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3261c1a4-1fc3-4584-a04b-a909176a21a7" (UID: "3261c1a4-1fc3-4584-a04b-a909176a21a7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.949011 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3261c1a4-1fc3-4584-a04b-a909176a21a7-logs" (OuterVolumeSpecName: "logs") pod "3261c1a4-1fc3-4584-a04b-a909176a21a7" (UID: "3261c1a4-1fc3-4584-a04b-a909176a21a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.984580 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3261c1a4-1fc3-4584-a04b-a909176a21a7" (UID: "3261c1a4-1fc3-4584-a04b-a909176a21a7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.987353 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-scripts" (OuterVolumeSpecName: "scripts") pod "3261c1a4-1fc3-4584-a04b-a909176a21a7" (UID: "3261c1a4-1fc3-4584-a04b-a909176a21a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.996064 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3261c1a4-1fc3-4584-a04b-a909176a21a7-kube-api-access-r4cnd" (OuterVolumeSpecName: "kube-api-access-r4cnd") pod "3261c1a4-1fc3-4584-a04b-a909176a21a7" (UID: "3261c1a4-1fc3-4584-a04b-a909176a21a7"). InnerVolumeSpecName "kube-api-access-r4cnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.052331 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4cnd\" (UniqueName: \"kubernetes.io/projected/3261c1a4-1fc3-4584-a04b-a909176a21a7-kube-api-access-r4cnd\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.052377 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3261c1a4-1fc3-4584-a04b-a909176a21a7-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.052389 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.052400 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.052412 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3261c1a4-1fc3-4584-a04b-a909176a21a7-logs\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.091619 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3261c1a4-1fc3-4584-a04b-a909176a21a7" (UID: "3261c1a4-1fc3-4584-a04b-a909176a21a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.125706 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data" (OuterVolumeSpecName: "config-data") pod "3261c1a4-1fc3-4584-a04b-a909176a21a7" (UID: "3261c1a4-1fc3-4584-a04b-a909176a21a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.153932 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.153973 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.324183 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-66955dfdb5-6j2wx" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.145:9696/\": dial tcp 10.217.0.145:9696: connect: connection refused"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.488843 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c8bdfb955-kjg92" event={"ID":"ceced30a-39e5-413f-a498-e5d4500f1eea","Type":"ContainerStarted","Data":"5ef35c6b23e5b2b25f4b975f83b3da467fa9da4c5c353859a75a722b4fa63404"}
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.488897 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c8bdfb955-kjg92" event={"ID":"ceced30a-39e5-413f-a498-e5d4500f1eea","Type":"ContainerStarted","Data":"167fa02eaeaac4e573ff079dd56096477c58422ab30bb1c424fbc17b68903c31"}
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.491947 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3261c1a4-1fc3-4584-a04b-a909176a21a7","Type":"ContainerDied","Data":"878d29e29b8947901ac3b55766798c44fc0595df994212304bdbb24caf194ff0"}
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.491998 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.492043 4792 scope.go:117] "RemoveContainer" containerID="6a2b4266f6e00d6aee5a043cd5566817d0900e0782c4c9c3fd61fe7b047d58d1"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.563221 4792 scope.go:117] "RemoveContainer" containerID="ebb3c90c66d61b3c96e538bb3a90dec662ac00998a3c455a15d931238797819c"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.595893 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.663242 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.679945 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 01 09:29:00 crc kubenswrapper[4792]: E0301 09:29:00.680306 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api-log"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.680323 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api-log"
Mar 01 09:29:00 crc kubenswrapper[4792]: E0301 09:29:00.680335 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.680343 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.680509 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.680529 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api-log"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.681375 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.684827 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.687343 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.688177 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.690804 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.766888 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/084f9db1-15eb-458c-8b43-aeb5dbb0555f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767017 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-config-data\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767049 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084f9db1-15eb-458c-8b43-aeb5dbb0555f-logs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767076 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-scripts\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767104 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zw59\" (UniqueName: \"kubernetes.io/projected/084f9db1-15eb-458c-8b43-aeb5dbb0555f-kube-api-access-4zw59\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767150 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767175 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767222 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767246 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-config-data-custom\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.869271 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/084f9db1-15eb-458c-8b43-aeb5dbb0555f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.869544 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-config-data\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.869634 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084f9db1-15eb-458c-8b43-aeb5dbb0555f-logs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.869735 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-scripts\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.869823 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zw59\" (UniqueName: \"kubernetes.io/projected/084f9db1-15eb-458c-8b43-aeb5dbb0555f-kube-api-access-4zw59\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.869942 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.870022 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.870109 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.870181 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-config-data-custom\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.869484 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/084f9db1-15eb-458c-8b43-aeb5dbb0555f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.870123 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084f9db1-15eb-458c-8b43-aeb5dbb0555f-logs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.919416 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-config-data-custom\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.920797 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.921034 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.922787 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.922790 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zw59\" (UniqueName: \"kubernetes.io/projected/084f9db1-15eb-458c-8b43-aeb5dbb0555f-kube-api-access-4zw59\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.923646 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-config-data\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.926438 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-scripts\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0"
Mar 01 09:29:01 crc kubenswrapper[4792]: I0301 09:29:01.000963 4792 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-api-0" Mar 01 09:29:01 crc kubenswrapper[4792]: I0301 09:29:01.032723 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:01 crc kubenswrapper[4792]: I0301 09:29:01.437257 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" path="/var/lib/kubelet/pods/3261c1a4-1fc3-4584-a04b-a909176a21a7/volumes" Mar 01 09:29:01 crc kubenswrapper[4792]: I0301 09:29:01.529342 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c8bdfb955-kjg92" event={"ID":"ceced30a-39e5-413f-a498-e5d4500f1eea","Type":"ContainerStarted","Data":"361e3576f7f3325679921320011c685a597478c9f150cb082a5ff7ee75562f43"} Mar 01 09:29:01 crc kubenswrapper[4792]: I0301 09:29:01.530318 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:29:01 crc kubenswrapper[4792]: I0301 09:29:01.656154 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c8bdfb955-kjg92" podStartSLOduration=3.6561348110000003 podStartE2EDuration="3.656134811s" podCreationTimestamp="2026-03-01 09:28:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:29:01.562129293 +0000 UTC m=+1270.804008510" watchObservedRunningTime="2026-03-01 09:29:01.656134811 +0000 UTC m=+1270.898014008" Mar 01 09:29:01 crc kubenswrapper[4792]: I0301 09:29:01.656456 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 01 09:29:02 crc kubenswrapper[4792]: I0301 09:29:02.579597 4792 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"084f9db1-15eb-458c-8b43-aeb5dbb0555f","Type":"ContainerStarted","Data":"8673a2c51f6ab102b805447b07d56e08fcc32edb4292ec021588a926f6ef7f67"} Mar 01 09:29:02 crc kubenswrapper[4792]: I0301 09:29:02.580103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"084f9db1-15eb-458c-8b43-aeb5dbb0555f","Type":"ContainerStarted","Data":"1c0429271d7fd689692116225410decbc6ae777d83db08e21c9aa60aeac32e90"} Mar 01 09:29:02 crc kubenswrapper[4792]: I0301 09:29:02.739073 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 01 09:29:02 crc kubenswrapper[4792]: I0301 09:29:02.969103 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:29:03 crc kubenswrapper[4792]: I0301 09:29:03.032478 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-drpr8"] Mar 01 09:29:03 crc kubenswrapper[4792]: I0301 09:29:03.032720 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" podUID="82467164-5e77-4ea0-beee-b3a70126c075" containerName="dnsmasq-dns" containerID="cri-o://4fab450c58bb439c06a393afb4862347d838c516e4e1d271f6a55936b2e24c70" gracePeriod=10 Mar 01 09:29:03 crc kubenswrapper[4792]: I0301 09:29:03.213351 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" podUID="82467164-5e77-4ea0-beee-b3a70126c075" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Mar 01 09:29:03 crc kubenswrapper[4792]: I0301 09:29:03.606372 4792 generic.go:334] "Generic (PLEG): container finished" podID="82467164-5e77-4ea0-beee-b3a70126c075" containerID="4fab450c58bb439c06a393afb4862347d838c516e4e1d271f6a55936b2e24c70" exitCode=0 Mar 01 09:29:03 crc 
kubenswrapper[4792]: I0301 09:29:03.606794 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" event={"ID":"82467164-5e77-4ea0-beee-b3a70126c075","Type":"ContainerDied","Data":"4fab450c58bb439c06a393afb4862347d838c516e4e1d271f6a55936b2e24c70"} Mar 01 09:29:03 crc kubenswrapper[4792]: I0301 09:29:03.928963 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.030113 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.046296 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-nb\") pod \"82467164-5e77-4ea0-beee-b3a70126c075\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.046343 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-sb\") pod \"82467164-5e77-4ea0-beee-b3a70126c075\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.046557 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-dns-svc\") pod \"82467164-5e77-4ea0-beee-b3a70126c075\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.046681 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-config\") pod \"82467164-5e77-4ea0-beee-b3a70126c075\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.046707 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg4gm\" (UniqueName: \"kubernetes.io/projected/82467164-5e77-4ea0-beee-b3a70126c075-kube-api-access-hg4gm\") pod \"82467164-5e77-4ea0-beee-b3a70126c075\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.070290 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82467164-5e77-4ea0-beee-b3a70126c075-kube-api-access-hg4gm" (OuterVolumeSpecName: "kube-api-access-hg4gm") pod "82467164-5e77-4ea0-beee-b3a70126c075" (UID: "82467164-5e77-4ea0-beee-b3a70126c075"). InnerVolumeSpecName "kube-api-access-hg4gm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.155112 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg4gm\" (UniqueName: \"kubernetes.io/projected/82467164-5e77-4ea0-beee-b3a70126c075-kube-api-access-hg4gm\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.222594 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "82467164-5e77-4ea0-beee-b3a70126c075" (UID: "82467164-5e77-4ea0-beee-b3a70126c075"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.229334 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "82467164-5e77-4ea0-beee-b3a70126c075" (UID: "82467164-5e77-4ea0-beee-b3a70126c075"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.230463 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "82467164-5e77-4ea0-beee-b3a70126c075" (UID: "82467164-5e77-4ea0-beee-b3a70126c075"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.259132 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.259164 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.259175 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.260350 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-config" (OuterVolumeSpecName: "config") pod "82467164-5e77-4ea0-beee-b3a70126c075" (UID: 
"82467164-5e77-4ea0-beee-b3a70126c075"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.360806 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.537147 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7f86869f48-jg6nw" podUID="6c97472a-b6b7-4fc4-b872-a318812f0999" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.537199 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.537211 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7f86869f48-jg6nw" podUID="6c97472a-b6b7-4fc4-b872-a318812f0999" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.537296 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f7447dcd6-cpnn5" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.615666 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" event={"ID":"82467164-5e77-4ea0-beee-b3a70126c075","Type":"ContainerDied","Data":"8b9ae7283e8be3bed9877439415e05f84a3d2c818de0aa317ad4a53c2c6bc4d6"} Mar 01 09:29:04 crc 
kubenswrapper[4792]: I0301 09:29:04.615713 4792 scope.go:117] "RemoveContainer" containerID="4fab450c58bb439c06a393afb4862347d838c516e4e1d271f6a55936b2e24c70" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.615822 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.629788 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"084f9db1-15eb-458c-8b43-aeb5dbb0555f","Type":"ContainerStarted","Data":"a0703157303b9cf20b3406286f8ec3e802bff9f1ce1298238119085d904b579c"} Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.630000 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.652677 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-drpr8"] Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.654742 4792 scope.go:117] "RemoveContainer" containerID="610fd4808253a760f048f5c379d2e39f3eec919b6601aa983fb2a5f9ece83ce6" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.669523 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-drpr8"] Mar 01 09:29:05 crc kubenswrapper[4792]: I0301 09:29:05.071183 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:05 crc kubenswrapper[4792]: I0301 09:29:05.071253 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" probeResult="failure" output="Get 
\"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:05 crc kubenswrapper[4792]: I0301 09:29:05.419018 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82467164-5e77-4ea0-beee-b3a70126c075" path="/var/lib/kubelet/pods/82467164-5e77-4ea0-beee-b3a70126c075/volumes" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.031600 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f7447dcd6-cpnn5" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.058650 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.058634044 podStartE2EDuration="6.058634044s" podCreationTimestamp="2026-03-01 09:29:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:29:04.692440519 +0000 UTC m=+1273.934319716" watchObservedRunningTime="2026-03-01 09:29:06.058634044 +0000 UTC m=+1275.300513241" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.074089 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.315087 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-84f9696594-qdwsv"] Mar 01 09:29:06 crc kubenswrapper[4792]: E0301 09:29:06.315426 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82467164-5e77-4ea0-beee-b3a70126c075" containerName="init" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.315441 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="82467164-5e77-4ea0-beee-b3a70126c075" containerName="init" Mar 01 09:29:06 crc kubenswrapper[4792]: E0301 09:29:06.315480 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82467164-5e77-4ea0-beee-b3a70126c075" containerName="dnsmasq-dns" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.315486 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="82467164-5e77-4ea0-beee-b3a70126c075" containerName="dnsmasq-dns" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.315635 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="82467164-5e77-4ea0-beee-b3a70126c075" containerName="dnsmasq-dns" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.316483 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.371678 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-84f9696594-qdwsv"] Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.412098 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f9e703-dec0-46e1-a428-580bdb68e54e-logs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.412193 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-internal-tls-certs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.412261 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qskh\" (UniqueName: 
\"kubernetes.io/projected/18f9e703-dec0-46e1-a428-580bdb68e54e-kube-api-access-9qskh\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.412286 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-combined-ca-bundle\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.412306 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-scripts\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.412343 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-config-data\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.412394 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-public-tls-certs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.515879 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/18f9e703-dec0-46e1-a428-580bdb68e54e-logs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.515973 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-internal-tls-certs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.516032 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qskh\" (UniqueName: \"kubernetes.io/projected/18f9e703-dec0-46e1-a428-580bdb68e54e-kube-api-access-9qskh\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.516052 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-combined-ca-bundle\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.516070 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-scripts\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.516099 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-config-data\") pod 
\"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.516153 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-public-tls-certs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.519350 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f9e703-dec0-46e1-a428-580bdb68e54e-logs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.528671 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-internal-tls-certs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.531302 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-scripts\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.534683 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-combined-ca-bundle\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 
09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.536458 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-public-tls-certs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.548566 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-config-data\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.561557 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qskh\" (UniqueName: \"kubernetes.io/projected/18f9e703-dec0-46e1-a428-580bdb68e54e-kube-api-access-9qskh\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.648592 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:07 crc kubenswrapper[4792]: I0301 09:29:07.209174 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-84f9696594-qdwsv"] Mar 01 09:29:07 crc kubenswrapper[4792]: I0301 09:29:07.603739 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:29:07 crc kubenswrapper[4792]: I0301 09:29:07.683470 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84f9696594-qdwsv" event={"ID":"18f9e703-dec0-46e1-a428-580bdb68e54e","Type":"ContainerStarted","Data":"c568ea25201f17a718f34a8b7570230a932f887e8964f3c3604abb9921beee61"} Mar 01 09:29:07 crc kubenswrapper[4792]: I0301 09:29:07.683533 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84f9696594-qdwsv" event={"ID":"18f9e703-dec0-46e1-a428-580bdb68e54e","Type":"ContainerStarted","Data":"b751d48eb0ce6cde01933eb86ab298561ebf12321c14d198ca69621f3d6bb2f9"} Mar 01 09:29:07 crc kubenswrapper[4792]: I0301 09:29:07.752312 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 01 09:29:07 crc kubenswrapper[4792]: I0301 09:29:07.795492 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.219605 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.315640 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-686d9f9896-9zsh2"] Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.315945 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" 
containerID="cri-o://b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943" gracePeriod=30
Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.316456 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" containerID="cri-o://b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b" gracePeriod=30
Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.332121 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": EOF"
Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.332206 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": EOF"
Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.344114 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": EOF"
Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.344503 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": EOF"
Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.344688 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": EOF"
Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.711065 4792 generic.go:334] "Generic (PLEG): container finished" podID="4721d2a8-efb5-4fa3-9779-797448455198" containerID="b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943" exitCode=143
Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.711152 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-686d9f9896-9zsh2" event={"ID":"4721d2a8-efb5-4fa3-9779-797448455198","Type":"ContainerDied","Data":"b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943"}
Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.716093 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="cinder-scheduler" containerID="cri-o://375e2e53d30f9933a6980133a207824c3a278795869bc631327981b1d5488167" gracePeriod=30
Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.716667 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84f9696594-qdwsv" event={"ID":"18f9e703-dec0-46e1-a428-580bdb68e54e","Type":"ContainerStarted","Data":"7b8baaece986d008808fd338ff18190d583e93de1447e07f1b636893c7f2019e"}
Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.716658 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="probe" containerID="cri-o://70aca8fafacb1a0424dab05fe3248c2f55093cc69642e8fa33c562316f86f9cc" gracePeriod=30
Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.717017 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-84f9696594-qdwsv"
Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.717056 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-84f9696594-qdwsv"
Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.766145 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-84f9696594-qdwsv" podStartSLOduration=2.766123392 podStartE2EDuration="2.766123392s" podCreationTimestamp="2026-03-01 09:29:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:29:08.76567031 +0000 UTC m=+1278.007549507" watchObservedRunningTime="2026-03-01 09:29:08.766123392 +0000 UTC m=+1278.008002589"
Mar 01 09:29:09 crc kubenswrapper[4792]: I0301 09:29:09.542121 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7f86869f48-jg6nw" podUID="6c97472a-b6b7-4fc4-b872-a318812f0999" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 01 09:29:09 crc kubenswrapper[4792]: I0301 09:29:09.725271 4792 generic.go:334] "Generic (PLEG): container finished" podID="5a7e11fe-898a-442c-b619-d7ccea385948" containerID="70aca8fafacb1a0424dab05fe3248c2f55093cc69642e8fa33c562316f86f9cc" exitCode=0
Mar 01 09:29:09 crc kubenswrapper[4792]: I0301 09:29:09.725357 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a7e11fe-898a-442c-b619-d7ccea385948","Type":"ContainerDied","Data":"70aca8fafacb1a0424dab05fe3248c2f55093cc69642e8fa33c562316f86f9cc"}
Mar 01 09:29:09 crc kubenswrapper[4792]: I0301 09:29:09.727985 4792 generic.go:334] "Generic (PLEG): container finished" podID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerID="171f60709b450c4b056a3daad4b651d3e95b5d1e5702d55092b0d107469669dc" exitCode=0
Mar 01 09:29:09 crc kubenswrapper[4792]: I0301 09:29:09.728337 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66955dfdb5-6j2wx" event={"ID":"90e1a395-ebc6-49ca-9924-c64283c12ec4","Type":"ContainerDied","Data":"171f60709b450c4b056a3daad4b651d3e95b5d1e5702d55092b0d107469669dc"}
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.200223 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.746256 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.746726 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66955dfdb5-6j2wx" event={"ID":"90e1a395-ebc6-49ca-9924-c64283c12ec4","Type":"ContainerDied","Data":"0e01f5820150bb847cd98736209f40a8cfae4c1d42fc832b6f738e299bc2db88"}
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.746771 4792 scope.go:117] "RemoveContainer" containerID="5ed4ad048c8c293d006aad77d5956e915974d390f9663c49f5f3c523ced04257"
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.759286 4792 generic.go:334] "Generic (PLEG): container finished" podID="5a7e11fe-898a-442c-b619-d7ccea385948" containerID="375e2e53d30f9933a6980133a207824c3a278795869bc631327981b1d5488167" exitCode=0
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.759331 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a7e11fe-898a-442c-b619-d7ccea385948","Type":"ContainerDied","Data":"375e2e53d30f9933a6980133a207824c3a278795869bc631327981b1d5488167"}
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.818536 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-public-tls-certs\") pod \"90e1a395-ebc6-49ca-9924-c64283c12ec4\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") "
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.818593 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8fsh\" (UniqueName: \"kubernetes.io/projected/90e1a395-ebc6-49ca-9924-c64283c12ec4-kube-api-access-h8fsh\") pod \"90e1a395-ebc6-49ca-9924-c64283c12ec4\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") "
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.818623 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-internal-tls-certs\") pod \"90e1a395-ebc6-49ca-9924-c64283c12ec4\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") "
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.818686 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-httpd-config\") pod \"90e1a395-ebc6-49ca-9924-c64283c12ec4\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") "
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.818709 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-config\") pod \"90e1a395-ebc6-49ca-9924-c64283c12ec4\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") "
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.818789 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-combined-ca-bundle\") pod \"90e1a395-ebc6-49ca-9924-c64283c12ec4\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") "
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.818897 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-ovndb-tls-certs\") pod \"90e1a395-ebc6-49ca-9924-c64283c12ec4\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") "
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.819362 4792 scope.go:117] "RemoveContainer" containerID="171f60709b450c4b056a3daad4b651d3e95b5d1e5702d55092b0d107469669dc"
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.830473 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e1a395-ebc6-49ca-9924-c64283c12ec4-kube-api-access-h8fsh" (OuterVolumeSpecName: "kube-api-access-h8fsh") pod "90e1a395-ebc6-49ca-9924-c64283c12ec4" (UID: "90e1a395-ebc6-49ca-9924-c64283c12ec4"). InnerVolumeSpecName "kube-api-access-h8fsh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.841778 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "90e1a395-ebc6-49ca-9924-c64283c12ec4" (UID: "90e1a395-ebc6-49ca-9924-c64283c12ec4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.903281 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "90e1a395-ebc6-49ca-9924-c64283c12ec4" (UID: "90e1a395-ebc6-49ca-9924-c64283c12ec4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.920483 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.920657 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8fsh\" (UniqueName: \"kubernetes.io/projected/90e1a395-ebc6-49ca-9924-c64283c12ec4-kube-api-access-h8fsh\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.920717 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.975460 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-config" (OuterVolumeSpecName: "config") pod "90e1a395-ebc6-49ca-9924-c64283c12ec4" (UID: "90e1a395-ebc6-49ca-9924-c64283c12ec4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.029775 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "90e1a395-ebc6-49ca-9924-c64283c12ec4" (UID: "90e1a395-ebc6-49ca-9924-c64283c12ec4"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.033154 4792 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.033183 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-config\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.046046 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "90e1a395-ebc6-49ca-9924-c64283c12ec4" (UID: "90e1a395-ebc6-49ca-9924-c64283c12ec4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.060770 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90e1a395-ebc6-49ca-9924-c64283c12ec4" (UID: "90e1a395-ebc6-49ca-9924-c64283c12ec4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.140224 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.140267 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.203715 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.343107 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-combined-ca-bundle\") pod \"5a7e11fe-898a-442c-b619-d7ccea385948\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") "
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.343227 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-scripts\") pod \"5a7e11fe-898a-442c-b619-d7ccea385948\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") "
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.343290 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data\") pod \"5a7e11fe-898a-442c-b619-d7ccea385948\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") "
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.343320 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttz97\" (UniqueName: \"kubernetes.io/projected/5a7e11fe-898a-442c-b619-d7ccea385948-kube-api-access-ttz97\") pod \"5a7e11fe-898a-442c-b619-d7ccea385948\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") "
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.343860 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a7e11fe-898a-442c-b619-d7ccea385948-etc-machine-id\") pod \"5a7e11fe-898a-442c-b619-d7ccea385948\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") "
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.343892 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data-custom\") pod \"5a7e11fe-898a-442c-b619-d7ccea385948\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") "
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.343977 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a7e11fe-898a-442c-b619-d7ccea385948-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5a7e11fe-898a-442c-b619-d7ccea385948" (UID: "5a7e11fe-898a-442c-b619-d7ccea385948"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.344493 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a7e11fe-898a-442c-b619-d7ccea385948-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.348386 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-scripts" (OuterVolumeSpecName: "scripts") pod "5a7e11fe-898a-442c-b619-d7ccea385948" (UID: "5a7e11fe-898a-442c-b619-d7ccea385948"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.352163 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a7e11fe-898a-442c-b619-d7ccea385948-kube-api-access-ttz97" (OuterVolumeSpecName: "kube-api-access-ttz97") pod "5a7e11fe-898a-442c-b619-d7ccea385948" (UID: "5a7e11fe-898a-442c-b619-d7ccea385948"). InnerVolumeSpecName "kube-api-access-ttz97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.353046 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5a7e11fe-898a-442c-b619-d7ccea385948" (UID: "5a7e11fe-898a-442c-b619-d7ccea385948"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.403258 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a7e11fe-898a-442c-b619-d7ccea385948" (UID: "5a7e11fe-898a-442c-b619-d7ccea385948"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.446448 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.446481 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.446490 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttz97\" (UniqueName: \"kubernetes.io/projected/5a7e11fe-898a-442c-b619-d7ccea385948-kube-api-access-ttz97\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.446502 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.451077 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data" (OuterVolumeSpecName: "config-data") pod "5a7e11fe-898a-442c-b619-d7ccea385948" (UID: "5a7e11fe-898a-442c-b619-d7ccea385948"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.548544 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.768280 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a7e11fe-898a-442c-b619-d7ccea385948","Type":"ContainerDied","Data":"35af3639b921b65729861f597174631d1ccc2f9baac748d7d2575658d3be08b9"}
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.768581 4792 scope.go:117] "RemoveContainer" containerID="70aca8fafacb1a0424dab05fe3248c2f55093cc69642e8fa33c562316f86f9cc"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.768509 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.771044 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.805937 4792 scope.go:117] "RemoveContainer" containerID="375e2e53d30f9933a6980133a207824c3a278795869bc631327981b1d5488167"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.815629 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66955dfdb5-6j2wx"]
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.834066 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-66955dfdb5-6j2wx"]
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.850966 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.863436 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880205 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 01 09:29:11 crc kubenswrapper[4792]: E0301 09:29:11.880608 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="cinder-scheduler"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880625 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="cinder-scheduler"
Mar 01 09:29:11 crc kubenswrapper[4792]: E0301 09:29:11.880635 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="probe"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880641 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="probe"
Mar 01 09:29:11 crc kubenswrapper[4792]: E0301 09:29:11.880653 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-httpd"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880659 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-httpd"
Mar 01 09:29:11 crc kubenswrapper[4792]: E0301 09:29:11.880671 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-api"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880677 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-api"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880849 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-httpd"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880861 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="probe"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880871 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-api"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880884 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="cinder-scheduler"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.881708 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.884198 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.914468 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.956062 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.956152 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-scripts\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.956214 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk84z\" (UniqueName: \"kubernetes.io/projected/688f590f-ae5c-4caf-b8c7-013a118f42c5-kube-api-access-lk84z\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.956268 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.956384 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-config-data\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.956423 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/688f590f-ae5c-4caf-b8c7-013a118f42c5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.057926 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.058026 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-config-data\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.058066 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/688f590f-ae5c-4caf-b8c7-013a118f42c5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.058117 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.058135 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-scripts\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.058165 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk84z\" (UniqueName: \"kubernetes.io/projected/688f590f-ae5c-4caf-b8c7-013a118f42c5-kube-api-access-lk84z\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.058457 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/688f590f-ae5c-4caf-b8c7-013a118f42c5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.064145 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-scripts\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.064833 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.066041 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-config-data\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.073445 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.099172 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk84z\" (UniqueName: \"kubernetes.io/projected/688f590f-ae5c-4caf-b8c7-013a118f42c5-kube-api-access-lk84z\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.197092 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.488073 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.489512 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.493336 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.493863 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.496221 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-f6v2c"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.539984 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.572914 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5bj9\" (UniqueName: \"kubernetes.io/projected/fecafda6-dcf9-46ea-8678-8da499154ad7-kube-api-access-p5bj9\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.573059 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fecafda6-dcf9-46ea-8678-8da499154ad7-openstack-config-secret\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.573095 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fecafda6-dcf9-46ea-8678-8da499154ad7-openstack-config\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.573155 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecafda6-dcf9-46ea-8678-8da499154ad7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.675587 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fecafda6-dcf9-46ea-8678-8da499154ad7-openstack-config-secret\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.675656 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fecafda6-dcf9-46ea-8678-8da499154ad7-openstack-config\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.675709 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecafda6-dcf9-46ea-8678-8da499154ad7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.675742 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5bj9\" (UniqueName: \"kubernetes.io/projected/fecafda6-dcf9-46ea-8678-8da499154ad7-kube-api-access-p5bj9\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.677601 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fecafda6-dcf9-46ea-8678-8da499154ad7-openstack-config\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.680376 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fecafda6-dcf9-46ea-8678-8da499154ad7-openstack-config-secret\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.687641 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecafda6-dcf9-46ea-8678-8da499154ad7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.735984 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5bj9\" (UniqueName: \"kubernetes.io/projected/fecafda6-dcf9-46ea-8678-8da499154ad7-kube-api-access-p5bj9\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient"
Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.823205 4792 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/openstackclient" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.908376 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 01 09:29:12 crc kubenswrapper[4792]: W0301 09:29:12.912549 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod688f590f_ae5c_4caf_b8c7_013a118f42c5.slice/crio-7e55159b365aaa12efa912d1ed7a3716f8dae566216ff02c51088f4e8c77d169 WatchSource:0}: Error finding container 7e55159b365aaa12efa912d1ed7a3716f8dae566216ff02c51088f4e8c77d169: Status 404 returned error can't find the container with id 7e55159b365aaa12efa912d1ed7a3716f8dae566216ff02c51088f4e8c77d169 Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.379657 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.385193 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.385314 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:29:13 crc kubenswrapper[4792]: W0301 09:29:13.398294 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfecafda6_dcf9_46ea_8678_8da499154ad7.slice/crio-945cdd534bc89f7e8b59ea5fd1ca3574fe34009c41b218483277963fb4e4ef6c WatchSource:0}: Error finding container 945cdd534bc89f7e8b59ea5fd1ca3574fe34009c41b218483277963fb4e4ef6c: Status 404 returned error can't find the container with id 
945cdd534bc89f7e8b59ea5fd1ca3574fe34009c41b218483277963fb4e4ef6c Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.419133 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" path="/var/lib/kubelet/pods/5a7e11fe-898a-442c-b619-d7ccea385948/volumes" Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.419870 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" path="/var/lib/kubelet/pods/90e1a395-ebc6-49ca-9924-c64283c12ec4/volumes" Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.796662 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"688f590f-ae5c-4caf-b8c7-013a118f42c5","Type":"ContainerStarted","Data":"33ccd7b9bd7cd18961dff5220e934d271bdba777b71b912d25cd7d44e78f7eeb"} Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.797069 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"688f590f-ae5c-4caf-b8c7-013a118f42c5","Type":"ContainerStarted","Data":"7e55159b365aaa12efa912d1ed7a3716f8dae566216ff02c51088f4e8c77d169"} Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.800399 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fecafda6-dcf9-46ea-8678-8da499154ad7","Type":"ContainerStarted","Data":"945cdd534bc89f7e8b59ea5fd1ca3574fe34009c41b218483277963fb4e4ef6c"} Mar 01 09:29:14 crc kubenswrapper[4792]: I0301 09:29:14.811969 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"688f590f-ae5c-4caf-b8c7-013a118f42c5","Type":"ContainerStarted","Data":"06eb3da651cd4d4a1436808973840acba906224bea877ac66132b9f76f86965a"} Mar 01 09:29:14 crc kubenswrapper[4792]: I0301 09:29:14.838475 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.838459822 
podStartE2EDuration="3.838459822s" podCreationTimestamp="2026-03-01 09:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:29:14.836524464 +0000 UTC m=+1284.078403661" watchObservedRunningTime="2026-03-01 09:29:14.838459822 +0000 UTC m=+1284.080339019" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.031451 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.079343 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="084f9db1-15eb-458c-8b43-aeb5dbb0555f" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.159:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.079875 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.092067 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": read tcp 10.217.0.2:34784->10.217.0.152:9311: read: connection reset by peer" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.092108 4792 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": read tcp 10.217.0.2:34794->10.217.0.152:9311: read: connection reset by peer" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.597588 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.609579 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.656332 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data\") pod \"4721d2a8-efb5-4fa3-9779-797448455198\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.656411 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58r6b\" (UniqueName: \"kubernetes.io/projected/4721d2a8-efb5-4fa3-9779-797448455198-kube-api-access-58r6b\") pod \"4721d2a8-efb5-4fa3-9779-797448455198\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.656507 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data-custom\") pod \"4721d2a8-efb5-4fa3-9779-797448455198\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.656535 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-combined-ca-bundle\") pod \"4721d2a8-efb5-4fa3-9779-797448455198\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.656584 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4721d2a8-efb5-4fa3-9779-797448455198-logs\") pod \"4721d2a8-efb5-4fa3-9779-797448455198\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.658104 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4721d2a8-efb5-4fa3-9779-797448455198-logs" (OuterVolumeSpecName: "logs") pod "4721d2a8-efb5-4fa3-9779-797448455198" (UID: "4721d2a8-efb5-4fa3-9779-797448455198"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.673558 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4721d2a8-efb5-4fa3-9779-797448455198-kube-api-access-58r6b" (OuterVolumeSpecName: "kube-api-access-58r6b") pod "4721d2a8-efb5-4fa3-9779-797448455198" (UID: "4721d2a8-efb5-4fa3-9779-797448455198"). InnerVolumeSpecName "kube-api-access-58r6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.675234 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4721d2a8-efb5-4fa3-9779-797448455198" (UID: "4721d2a8-efb5-4fa3-9779-797448455198"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.757880 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4721d2a8-efb5-4fa3-9779-797448455198" (UID: "4721d2a8-efb5-4fa3-9779-797448455198"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.759672 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4721d2a8-efb5-4fa3-9779-797448455198-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.759696 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58r6b\" (UniqueName: \"kubernetes.io/projected/4721d2a8-efb5-4fa3-9779-797448455198-kube-api-access-58r6b\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.759707 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.759714 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.769847 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data" (OuterVolumeSpecName: "config-data") pod "4721d2a8-efb5-4fa3-9779-797448455198" (UID: "4721d2a8-efb5-4fa3-9779-797448455198"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.857123 4792 generic.go:334] "Generic (PLEG): container finished" podID="4721d2a8-efb5-4fa3-9779-797448455198" containerID="b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b" exitCode=0 Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.858116 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-686d9f9896-9zsh2" event={"ID":"4721d2a8-efb5-4fa3-9779-797448455198","Type":"ContainerDied","Data":"b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b"} Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.858147 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-686d9f9896-9zsh2" event={"ID":"4721d2a8-efb5-4fa3-9779-797448455198","Type":"ContainerDied","Data":"128fecf3528e96b978b34551abfcb59be5f329262f2b7798f19013b2931fb841"} Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.858164 4792 scope.go:117] "RemoveContainer" containerID="b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.858266 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.863103 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.922964 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-686d9f9896-9zsh2"] Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.957882 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-686d9f9896-9zsh2"] Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.962026 4792 scope.go:117] "RemoveContainer" containerID="b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943" Mar 01 09:29:16 crc kubenswrapper[4792]: I0301 09:29:16.042281 4792 scope.go:117] "RemoveContainer" containerID="b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b" Mar 01 09:29:16 crc kubenswrapper[4792]: E0301 09:29:16.051509 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b\": container with ID starting with b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b not found: ID does not exist" containerID="b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b" Mar 01 09:29:16 crc kubenswrapper[4792]: I0301 09:29:16.051600 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b"} err="failed to get container status \"b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b\": rpc error: code = NotFound desc = could not find container \"b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b\": container with ID starting with 
b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b not found: ID does not exist" Mar 01 09:29:16 crc kubenswrapper[4792]: I0301 09:29:16.051629 4792 scope.go:117] "RemoveContainer" containerID="b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943" Mar 01 09:29:16 crc kubenswrapper[4792]: E0301 09:29:16.053340 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943\": container with ID starting with b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943 not found: ID does not exist" containerID="b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943" Mar 01 09:29:16 crc kubenswrapper[4792]: I0301 09:29:16.053380 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943"} err="failed to get container status \"b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943\": rpc error: code = NotFound desc = could not find container \"b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943\": container with ID starting with b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943 not found: ID does not exist" Mar 01 09:29:17 crc kubenswrapper[4792]: I0301 09:29:17.197836 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 01 09:29:17 crc kubenswrapper[4792]: I0301 09:29:17.417285 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4721d2a8-efb5-4fa3-9779-797448455198" path="/var/lib/kubelet/pods/4721d2a8-efb5-4fa3-9779-797448455198/volumes" Mar 01 09:29:20 crc kubenswrapper[4792]: I0301 09:29:20.421709 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 01 09:29:22 crc kubenswrapper[4792]: I0301 09:29:22.426882 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 01 09:29:23 crc kubenswrapper[4792]: I0301 09:29:23.124709 4792 scope.go:117] "RemoveContainer" containerID="b3416cff442b7b3bec1893fb5c0aa2d61087db5d4679dae5ed62f8ea4a150ca7" Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.607201 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.607452 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="ceilometer-central-agent" containerID="cri-o://b90600661de56082bcb7ae56c666abd530a56c37ee4cdb45c25503eb416e5a05" gracePeriod=30 Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.607539 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="proxy-httpd" containerID="cri-o://3b83ecfb929c62345bb4c7e97676547d3a0dba6dd25466d0bdaa53eaba514d91" gracePeriod=30 Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.607566 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="ceilometer-notification-agent" containerID="cri-o://30490c34339b14f02bc47ec05535e3863a7df94573da07df5321b635c71d016a" gracePeriod=30 Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.607574 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="sg-core" containerID="cri-o://23d29fcdda3c38825545e571c946d862613874d7ef2eb012aff4b4af60d5268e" gracePeriod=30 Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.950302 4792 generic.go:334] "Generic (PLEG): container finished" podID="73747536-7a61-4c63-87c7-9e4c72471fb1" 
containerID="3b83ecfb929c62345bb4c7e97676547d3a0dba6dd25466d0bdaa53eaba514d91" exitCode=0 Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.950552 4792 generic.go:334] "Generic (PLEG): container finished" podID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerID="23d29fcdda3c38825545e571c946d862613874d7ef2eb012aff4b4af60d5268e" exitCode=2 Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.950582 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerDied","Data":"3b83ecfb929c62345bb4c7e97676547d3a0dba6dd25466d0bdaa53eaba514d91"} Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.950607 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerDied","Data":"23d29fcdda3c38825545e571c946d862613874d7ef2eb012aff4b4af60d5268e"} Mar 01 09:29:25 crc kubenswrapper[4792]: I0301 09:29:25.961693 4792 generic.go:334] "Generic (PLEG): container finished" podID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerID="30490c34339b14f02bc47ec05535e3863a7df94573da07df5321b635c71d016a" exitCode=0 Mar 01 09:29:25 crc kubenswrapper[4792]: I0301 09:29:25.961731 4792 generic.go:334] "Generic (PLEG): container finished" podID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerID="b90600661de56082bcb7ae56c666abd530a56c37ee4cdb45c25503eb416e5a05" exitCode=0 Mar 01 09:29:25 crc kubenswrapper[4792]: I0301 09:29:25.961754 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerDied","Data":"30490c34339b14f02bc47ec05535e3863a7df94573da07df5321b635c71d016a"} Mar 01 09:29:25 crc kubenswrapper[4792]: I0301 09:29:25.961781 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerDied","Data":"b90600661de56082bcb7ae56c666abd530a56c37ee4cdb45c25503eb416e5a05"} Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.085214 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.169042 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-config-data\") pod \"73747536-7a61-4c63-87c7-9e4c72471fb1\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.169104 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-scripts\") pod \"73747536-7a61-4c63-87c7-9e4c72471fb1\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.169138 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-combined-ca-bundle\") pod \"73747536-7a61-4c63-87c7-9e4c72471fb1\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.169223 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-sg-core-conf-yaml\") pod \"73747536-7a61-4c63-87c7-9e4c72471fb1\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.169284 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpnn8\" (UniqueName: \"kubernetes.io/projected/73747536-7a61-4c63-87c7-9e4c72471fb1-kube-api-access-kpnn8\") pod 
\"73747536-7a61-4c63-87c7-9e4c72471fb1\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.169366 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-log-httpd\") pod \"73747536-7a61-4c63-87c7-9e4c72471fb1\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.169610 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-run-httpd\") pod \"73747536-7a61-4c63-87c7-9e4c72471fb1\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.171065 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "73747536-7a61-4c63-87c7-9e4c72471fb1" (UID: "73747536-7a61-4c63-87c7-9e4c72471fb1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.189148 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "73747536-7a61-4c63-87c7-9e4c72471fb1" (UID: "73747536-7a61-4c63-87c7-9e4c72471fb1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.201211 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73747536-7a61-4c63-87c7-9e4c72471fb1-kube-api-access-kpnn8" (OuterVolumeSpecName: "kube-api-access-kpnn8") pod "73747536-7a61-4c63-87c7-9e4c72471fb1" (UID: "73747536-7a61-4c63-87c7-9e4c72471fb1"). 
InnerVolumeSpecName "kube-api-access-kpnn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.204150 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-scripts" (OuterVolumeSpecName: "scripts") pod "73747536-7a61-4c63-87c7-9e4c72471fb1" (UID: "73747536-7a61-4c63-87c7-9e4c72471fb1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.274483 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.274513 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpnn8\" (UniqueName: \"kubernetes.io/projected/73747536-7a61-4c63-87c7-9e4c72471fb1-kube-api-access-kpnn8\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.274522 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.274533 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.292119 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "73747536-7a61-4c63-87c7-9e4c72471fb1" (UID: "73747536-7a61-4c63-87c7-9e4c72471fb1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.376548 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.480366 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73747536-7a61-4c63-87c7-9e4c72471fb1" (UID: "73747536-7a61-4c63-87c7-9e4c72471fb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.487060 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-config-data" (OuterVolumeSpecName: "config-data") pod "73747536-7a61-4c63-87c7-9e4c72471fb1" (UID: "73747536-7a61-4c63-87c7-9e4c72471fb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.579181 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.579217 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.982681 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.982676 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerDied","Data":"8f407749323a926af0db11e4921c8f80c0b44788d7a0172e925467426ce4a55c"}
Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.982967 4792 scope.go:117] "RemoveContainer" containerID="3b83ecfb929c62345bb4c7e97676547d3a0dba6dd25466d0bdaa53eaba514d91"
Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.985029 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fecafda6-dcf9-46ea-8678-8da499154ad7","Type":"ContainerStarted","Data":"2eee12d1f27a3adae4e9750f69a737edde8e026f3558c981659a70a138181bbd"}
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.001113 4792 scope.go:117] "RemoveContainer" containerID="23d29fcdda3c38825545e571c946d862613874d7ef2eb012aff4b4af60d5268e"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.016169 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.7124536 podStartE2EDuration="16.01614824s" podCreationTimestamp="2026-03-01 09:29:12 +0000 UTC" firstStartedPulling="2026-03-01 09:29:13.403697735 +0000 UTC m=+1282.645576932" lastFinishedPulling="2026-03-01 09:29:26.707392375 +0000 UTC m=+1295.949271572" observedRunningTime="2026-03-01 09:29:28.003458588 +0000 UTC m=+1297.245337785" watchObservedRunningTime="2026-03-01 09:29:28.01614824 +0000 UTC m=+1297.258027437"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.035831 4792 scope.go:117] "RemoveContainer" containerID="30490c34339b14f02bc47ec05535e3863a7df94573da07df5321b635c71d016a"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.080938 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.101062 4792 scope.go:117] "RemoveContainer" containerID="b90600661de56082bcb7ae56c666abd530a56c37ee4cdb45c25503eb416e5a05"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.103400 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.115494 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:28 crc kubenswrapper[4792]: E0301 09:29:28.115946 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="sg-core"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.115963 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="sg-core"
Mar 01 09:29:28 crc kubenswrapper[4792]: E0301 09:29:28.115983 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116007 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api"
Mar 01 09:29:28 crc kubenswrapper[4792]: E0301 09:29:28.116017 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="ceilometer-notification-agent"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116025 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="ceilometer-notification-agent"
Mar 01 09:29:28 crc kubenswrapper[4792]: E0301 09:29:28.116046 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="ceilometer-central-agent"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116053 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="ceilometer-central-agent"
Mar 01 09:29:28 crc kubenswrapper[4792]: E0301 09:29:28.116068 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="proxy-httpd"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116075 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="proxy-httpd"
Mar 01 09:29:28 crc kubenswrapper[4792]: E0301 09:29:28.116086 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116093 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116308 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116329 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="ceilometer-central-agent"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116340 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="proxy-httpd"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116354 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116370 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="sg-core"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116384 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="ceilometer-notification-agent"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.121433 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.124001 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.125383 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.152498 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.292784 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-scripts\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.292831 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-log-httpd\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.292857 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.292883 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptvq9\" (UniqueName: \"kubernetes.io/projected/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-kube-api-access-ptvq9\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.292972 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.293005 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-config-data\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.293297 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-run-httpd\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.394837 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-run-httpd\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395154 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-scripts\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395183 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-log-httpd\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395226 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptvq9\" (UniqueName: \"kubernetes.io/projected/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-kube-api-access-ptvq9\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395267 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395284 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-config-data\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395344 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-run-httpd\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395671 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-log-httpd\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.398503 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.399742 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-config-data\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.400663 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-scripts\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.401527 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.422287 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptvq9\" (UniqueName: \"kubernetes.io/projected/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-kube-api-access-ptvq9\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0"
Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.448485 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:29:29 crc kubenswrapper[4792]: I0301 09:29:29.007614 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:29 crc kubenswrapper[4792]: I0301 09:29:29.017300 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerStarted","Data":"a26311f7480277741aeeb123ba5ce9e74cfc4cc3c5c9582b395475f7202f8016"}
Mar 01 09:29:29 crc kubenswrapper[4792]: I0301 09:29:29.089070 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6c8bdfb955-kjg92"
Mar 01 09:29:29 crc kubenswrapper[4792]: I0301 09:29:29.166929 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bbc5b86d6-8b672"]
Mar 01 09:29:29 crc kubenswrapper[4792]: I0301 09:29:29.167169 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7bbc5b86d6-8b672" podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-api" containerID="cri-o://ca55616f2e5de805229eb50d3f643de3b33e4b53039ccf7569dc0337fc8e14a5" gracePeriod=30
Mar 01 09:29:29 crc kubenswrapper[4792]: I0301 09:29:29.167602 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7bbc5b86d6-8b672" podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-httpd" containerID="cri-o://2f7d0d1b6918c9e962de8e492d201469a86475bd623f4086bd8a9e1f30a74d63" gracePeriod=30
Mar 01 09:29:29 crc kubenswrapper[4792]: I0301 09:29:29.476231 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" path="/var/lib/kubelet/pods/73747536-7a61-4c63-87c7-9e4c72471fb1/volumes"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.025112 4792 generic.go:334] "Generic (PLEG): container finished" podID="013566fd-5627-422a-809a-e81a8ec059d9" containerID="2f7d0d1b6918c9e962de8e492d201469a86475bd623f4086bd8a9e1f30a74d63" exitCode=0
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.025183 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bbc5b86d6-8b672" event={"ID":"013566fd-5627-422a-809a-e81a8ec059d9","Type":"ContainerDied","Data":"2f7d0d1b6918c9e962de8e492d201469a86475bd623f4086bd8a9e1f30a74d63"}
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.615223 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-zj224"]
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.616715 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zj224"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.642636 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zj224"]
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.719790 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qt42r"]
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.726491 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qt42r"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.741467 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qt42r"]
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.749442 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z52tq\" (UniqueName: \"kubernetes.io/projected/a069955e-f546-4522-97ec-5a529f79b1aa-kube-api-access-z52tq\") pod \"nova-api-db-create-zj224\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " pod="openstack/nova-api-db-create-zj224"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.749545 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a069955e-f546-4522-97ec-5a529f79b1aa-operator-scripts\") pod \"nova-api-db-create-zj224\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " pod="openstack/nova-api-db-create-zj224"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.810800 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-8f2r2"]
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.811868 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8f2r2"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.823309 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-474a-account-create-update-dlgkl"]
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.824664 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-474a-account-create-update-dlgkl"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.828550 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.838054 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8f2r2"]
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.851060 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a069955e-f546-4522-97ec-5a529f79b1aa-operator-scripts\") pod \"nova-api-db-create-zj224\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " pod="openstack/nova-api-db-create-zj224"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.851152 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj6j5\" (UniqueName: \"kubernetes.io/projected/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-kube-api-access-lj6j5\") pod \"nova-cell0-db-create-qt42r\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " pod="openstack/nova-cell0-db-create-qt42r"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.851188 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-operator-scripts\") pod \"nova-cell0-db-create-qt42r\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " pod="openstack/nova-cell0-db-create-qt42r"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.851210 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z52tq\" (UniqueName: \"kubernetes.io/projected/a069955e-f546-4522-97ec-5a529f79b1aa-kube-api-access-z52tq\") pod \"nova-api-db-create-zj224\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " pod="openstack/nova-api-db-create-zj224"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.852122 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a069955e-f546-4522-97ec-5a529f79b1aa-operator-scripts\") pod \"nova-api-db-create-zj224\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " pod="openstack/nova-api-db-create-zj224"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.868138 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-474a-account-create-update-dlgkl"]
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.890737 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z52tq\" (UniqueName: \"kubernetes.io/projected/a069955e-f546-4522-97ec-5a529f79b1aa-kube-api-access-z52tq\") pod \"nova-api-db-create-zj224\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " pod="openstack/nova-api-db-create-zj224"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.934110 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zj224"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.959038 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj6j5\" (UniqueName: \"kubernetes.io/projected/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-kube-api-access-lj6j5\") pod \"nova-cell0-db-create-qt42r\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " pod="openstack/nova-cell0-db-create-qt42r"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.959107 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-operator-scripts\") pod \"nova-cell0-db-create-qt42r\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " pod="openstack/nova-cell0-db-create-qt42r"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.959148 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztqnk\" (UniqueName: \"kubernetes.io/projected/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-kube-api-access-ztqnk\") pod \"nova-cell1-db-create-8f2r2\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") " pod="openstack/nova-cell1-db-create-8f2r2"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.959182 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-operator-scripts\") pod \"nova-cell1-db-create-8f2r2\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") " pod="openstack/nova-cell1-db-create-8f2r2"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.959240 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw6bw\" (UniqueName: \"kubernetes.io/projected/09b4c86e-31ba-4d91-a602-39fa3a57c798-kube-api-access-fw6bw\") pod \"nova-api-474a-account-create-update-dlgkl\" (UID: \"09b4c86e-31ba-4d91-a602-39fa3a57c798\") " pod="openstack/nova-api-474a-account-create-update-dlgkl"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.959294 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09b4c86e-31ba-4d91-a602-39fa3a57c798-operator-scripts\") pod \"nova-api-474a-account-create-update-dlgkl\" (UID: \"09b4c86e-31ba-4d91-a602-39fa3a57c798\") " pod="openstack/nova-api-474a-account-create-update-dlgkl"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.960357 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-operator-scripts\") pod \"nova-cell0-db-create-qt42r\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " pod="openstack/nova-cell0-db-create-qt42r"
Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.983033 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj6j5\" (UniqueName: \"kubernetes.io/projected/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-kube-api-access-lj6j5\") pod \"nova-cell0-db-create-qt42r\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " pod="openstack/nova-cell0-db-create-qt42r"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.025447 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-3b7a-account-create-update-2vc26"]
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.026703 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3b7a-account-create-update-2vc26"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.029124 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.051746 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qt42r"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.067664 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztqnk\" (UniqueName: \"kubernetes.io/projected/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-kube-api-access-ztqnk\") pod \"nova-cell1-db-create-8f2r2\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") " pod="openstack/nova-cell1-db-create-8f2r2"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.067725 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-operator-scripts\") pod \"nova-cell1-db-create-8f2r2\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") " pod="openstack/nova-cell1-db-create-8f2r2"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.067785 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw6bw\" (UniqueName: \"kubernetes.io/projected/09b4c86e-31ba-4d91-a602-39fa3a57c798-kube-api-access-fw6bw\") pod \"nova-api-474a-account-create-update-dlgkl\" (UID: \"09b4c86e-31ba-4d91-a602-39fa3a57c798\") " pod="openstack/nova-api-474a-account-create-update-dlgkl"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.067831 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09b4c86e-31ba-4d91-a602-39fa3a57c798-operator-scripts\") pod \"nova-api-474a-account-create-update-dlgkl\" (UID: \"09b4c86e-31ba-4d91-a602-39fa3a57c798\") " pod="openstack/nova-api-474a-account-create-update-dlgkl"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.069688 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09b4c86e-31ba-4d91-a602-39fa3a57c798-operator-scripts\") pod \"nova-api-474a-account-create-update-dlgkl\" (UID: \"09b4c86e-31ba-4d91-a602-39fa3a57c798\") " pod="openstack/nova-api-474a-account-create-update-dlgkl"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.076864 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-operator-scripts\") pod \"nova-cell1-db-create-8f2r2\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") " pod="openstack/nova-cell1-db-create-8f2r2"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.099103 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3b7a-account-create-update-2vc26"]
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.109829 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerStarted","Data":"20059357b7d0dcf77c24de3fd3876e0d7701b71cfa87b07f6aaaa2491b33046c"}
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.118153 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw6bw\" (UniqueName: \"kubernetes.io/projected/09b4c86e-31ba-4d91-a602-39fa3a57c798-kube-api-access-fw6bw\") pod \"nova-api-474a-account-create-update-dlgkl\" (UID: \"09b4c86e-31ba-4d91-a602-39fa3a57c798\") " pod="openstack/nova-api-474a-account-create-update-dlgkl"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.145212 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztqnk\" (UniqueName: \"kubernetes.io/projected/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-kube-api-access-ztqnk\") pod \"nova-cell1-db-create-8f2r2\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") " pod="openstack/nova-cell1-db-create-8f2r2"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.149979 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-474a-account-create-update-dlgkl"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.173893 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkcdk\" (UniqueName: \"kubernetes.io/projected/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-kube-api-access-dkcdk\") pod \"nova-cell0-3b7a-account-create-update-2vc26\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " pod="openstack/nova-cell0-3b7a-account-create-update-2vc26"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.173964 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-operator-scripts\") pod \"nova-cell0-3b7a-account-create-update-2vc26\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " pod="openstack/nova-cell0-3b7a-account-create-update-2vc26"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.255221 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c72f-account-create-update-x9vvj"]
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.258185 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.273315 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.275029 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkcdk\" (UniqueName: \"kubernetes.io/projected/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-kube-api-access-dkcdk\") pod \"nova-cell0-3b7a-account-create-update-2vc26\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " pod="openstack/nova-cell0-3b7a-account-create-update-2vc26"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.275062 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-operator-scripts\") pod \"nova-cell0-3b7a-account-create-update-2vc26\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " pod="openstack/nova-cell0-3b7a-account-create-update-2vc26"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.275749 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-operator-scripts\") pod \"nova-cell0-3b7a-account-create-update-2vc26\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " pod="openstack/nova-cell0-3b7a-account-create-update-2vc26"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.303287 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c72f-account-create-update-x9vvj"]
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.314456 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkcdk\" (UniqueName: \"kubernetes.io/projected/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-kube-api-access-dkcdk\") pod \"nova-cell0-3b7a-account-create-update-2vc26\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " pod="openstack/nova-cell0-3b7a-account-create-update-2vc26"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.353414 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3b7a-account-create-update-2vc26"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.377093 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2be4f49-c20a-4e25-bff3-e4617d275fa1-operator-scripts\") pod \"nova-cell1-c72f-account-create-update-x9vvj\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") " pod="openstack/nova-cell1-c72f-account-create-update-x9vvj"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.377266 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dgpl\" (UniqueName: \"kubernetes.io/projected/f2be4f49-c20a-4e25-bff3-e4617d275fa1-kube-api-access-9dgpl\") pod \"nova-cell1-c72f-account-create-update-x9vvj\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") " pod="openstack/nova-cell1-c72f-account-create-update-x9vvj"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.432525 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8f2r2"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.480145 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dgpl\" (UniqueName: \"kubernetes.io/projected/f2be4f49-c20a-4e25-bff3-e4617d275fa1-kube-api-access-9dgpl\") pod \"nova-cell1-c72f-account-create-update-x9vvj\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") " pod="openstack/nova-cell1-c72f-account-create-update-x9vvj"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.480252 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2be4f49-c20a-4e25-bff3-e4617d275fa1-operator-scripts\") pod \"nova-cell1-c72f-account-create-update-x9vvj\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") " pod="openstack/nova-cell1-c72f-account-create-update-x9vvj"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.481494 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2be4f49-c20a-4e25-bff3-e4617d275fa1-operator-scripts\") pod \"nova-cell1-c72f-account-create-update-x9vvj\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") " pod="openstack/nova-cell1-c72f-account-create-update-x9vvj"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.500672 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dgpl\" (UniqueName: \"kubernetes.io/projected/f2be4f49-c20a-4e25-bff3-e4617d275fa1-kube-api-access-9dgpl\") pod \"nova-cell1-c72f-account-create-update-x9vvj\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") " pod="openstack/nova-cell1-c72f-account-create-update-x9vvj"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.576424 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.669614 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj"
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.748819 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zj224"]
Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.908687 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qt42r"]
Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.075091 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-474a-account-create-update-dlgkl"]
Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.195764 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qt42r" event={"ID":"21b0442e-f4b4-4f59-b3c5-1510ae4d792c","Type":"ContainerStarted","Data":"5ac29da5fa52816d6af4f4e774013cba308601b08c0ed36851caa33072f6b41d"}
Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.222070 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zj224" event={"ID":"a069955e-f546-4522-97ec-5a529f79b1aa","Type":"ContainerStarted","Data":"82dc2fec75535cf7d5ba98c257213dcf0b978f1770576dc67153fee7dc3473af"}
Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.222116 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zj224" event={"ID":"a069955e-f546-4522-97ec-5a529f79b1aa","Type":"ContainerStarted","Data":"035975688c011495ce93c6c1b1155e999f1292fecaac7c3671be568df507bd87"}
Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.239692 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3b7a-account-create-update-2vc26"]
Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.263860 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-zj224" podStartSLOduration=2.263840258 podStartE2EDuration="2.263840258s"
podCreationTimestamp="2026-03-01 09:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:29:32.258270465 +0000 UTC m=+1301.500149682" watchObservedRunningTime="2026-03-01 09:29:32.263840258 +0000 UTC m=+1301.505719455" Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.263973 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8f2r2"] Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.283619 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerStarted","Data":"2974b52b9fe70c71e1067f1e48fbfe7f66447a69916e1c15a62b5a2377bf5549"} Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.481540 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c72f-account-create-update-x9vvj"] Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.312693 4792 generic.go:334] "Generic (PLEG): container finished" podID="21b0442e-f4b4-4f59-b3c5-1510ae4d792c" containerID="7a8d1321567d66f2ccb1955a5edf06d8800b1b50205fae01644b45e7fa573653" exitCode=0 Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.312885 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qt42r" event={"ID":"21b0442e-f4b4-4f59-b3c5-1510ae4d792c","Type":"ContainerDied","Data":"7a8d1321567d66f2ccb1955a5edf06d8800b1b50205fae01644b45e7fa573653"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.319474 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8f2r2" event={"ID":"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe","Type":"ContainerStarted","Data":"dffb4e0c554e65ca661a311e46e8c09863ee0d9c8fb1783c221a0323e637a5f5"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.319512 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-db-create-8f2r2" event={"ID":"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe","Type":"ContainerStarted","Data":"dadc3c08b41a7bc9b02729db8b51278d209590e1ed422026617fbb3c1754363b"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.323563 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-474a-account-create-update-dlgkl" event={"ID":"09b4c86e-31ba-4d91-a602-39fa3a57c798","Type":"ContainerStarted","Data":"92439593a89d55067268c29652937630710da016f1c0b75141189d39ceeced86"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.323621 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-474a-account-create-update-dlgkl" event={"ID":"09b4c86e-31ba-4d91-a602-39fa3a57c798","Type":"ContainerStarted","Data":"3feefdb24fc4187fb81280730c9fa62f502ce358f1aaf7380b427d95a1866213"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.330383 4792 generic.go:334] "Generic (PLEG): container finished" podID="a069955e-f546-4522-97ec-5a529f79b1aa" containerID="82dc2fec75535cf7d5ba98c257213dcf0b978f1770576dc67153fee7dc3473af" exitCode=0 Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.330456 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zj224" event={"ID":"a069955e-f546-4522-97ec-5a529f79b1aa","Type":"ContainerDied","Data":"82dc2fec75535cf7d5ba98c257213dcf0b978f1770576dc67153fee7dc3473af"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.353882 4792 generic.go:334] "Generic (PLEG): container finished" podID="e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2" containerID="05d5887b441a9b375453d0ad6f9bd8826e5d3d116043c948abf3e299df007d6e" exitCode=0 Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.353993 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" event={"ID":"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2","Type":"ContainerDied","Data":"05d5887b441a9b375453d0ad6f9bd8826e5d3d116043c948abf3e299df007d6e"} 
Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.354018 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" event={"ID":"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2","Type":"ContainerStarted","Data":"e15a28a43dd6c1126f3b22e8480bf4ba4434e945130f45feec86bf90e7913964"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.397343 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" event={"ID":"f2be4f49-c20a-4e25-bff3-e4617d275fa1","Type":"ContainerStarted","Data":"d36c7b1a79f4c4f6b03c61a96da47d8288af453f4f9225ccf2c3d4099f44d0df"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.398807 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" event={"ID":"f2be4f49-c20a-4e25-bff3-e4617d275fa1","Type":"ContainerStarted","Data":"771711ce9c2752c3012e04bcade441dfda2723a81435feaa4ce6f2649bebee72"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.402016 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-8f2r2" podStartSLOduration=3.401996338 podStartE2EDuration="3.401996338s" podCreationTimestamp="2026-03-01 09:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:29:33.38038928 +0000 UTC m=+1302.622268477" watchObservedRunningTime="2026-03-01 09:29:33.401996338 +0000 UTC m=+1302.643875535" Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.447855 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-474a-account-create-update-dlgkl" podStartSLOduration=3.447836667 podStartE2EDuration="3.447836667s" podCreationTimestamp="2026-03-01 09:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-01 09:29:33.423159266 +0000 UTC m=+1302.665038463" watchObservedRunningTime="2026-03-01 09:29:33.447836667 +0000 UTC m=+1302.689715864" Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.489433 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerStarted","Data":"f1c336e0ebe268dd9ca94d30a4f01dbdf7925917a75d6e87aa3514ba966658df"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.500352 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" podStartSLOduration=2.500334086 podStartE2EDuration="2.500334086s" podCreationTimestamp="2026-03-01 09:29:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:29:33.466582467 +0000 UTC m=+1302.708461664" watchObservedRunningTime="2026-03-01 09:29:33.500334086 +0000 UTC m=+1302.742213283" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.437283 4792 generic.go:334] "Generic (PLEG): container finished" podID="f2be4f49-c20a-4e25-bff3-e4617d275fa1" containerID="d36c7b1a79f4c4f6b03c61a96da47d8288af453f4f9225ccf2c3d4099f44d0df" exitCode=0 Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.437648 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" event={"ID":"f2be4f49-c20a-4e25-bff3-e4617d275fa1","Type":"ContainerDied","Data":"d36c7b1a79f4c4f6b03c61a96da47d8288af453f4f9225ccf2c3d4099f44d0df"} Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.439944 4792 generic.go:334] "Generic (PLEG): container finished" podID="f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe" containerID="dffb4e0c554e65ca661a311e46e8c09863ee0d9c8fb1783c221a0323e637a5f5" exitCode=0 Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.440004 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell1-db-create-8f2r2" event={"ID":"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe","Type":"ContainerDied","Data":"dffb4e0c554e65ca661a311e46e8c09863ee0d9c8fb1783c221a0323e637a5f5"} Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.443785 4792 generic.go:334] "Generic (PLEG): container finished" podID="09b4c86e-31ba-4d91-a602-39fa3a57c798" containerID="92439593a89d55067268c29652937630710da016f1c0b75141189d39ceeced86" exitCode=0 Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.443856 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-474a-account-create-update-dlgkl" event={"ID":"09b4c86e-31ba-4d91-a602-39fa3a57c798","Type":"ContainerDied","Data":"92439593a89d55067268c29652937630710da016f1c0b75141189d39ceeced86"} Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.446086 4792 generic.go:334] "Generic (PLEG): container finished" podID="013566fd-5627-422a-809a-e81a8ec059d9" containerID="ca55616f2e5de805229eb50d3f643de3b33e4b53039ccf7569dc0337fc8e14a5" exitCode=0 Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.446188 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bbc5b86d6-8b672" event={"ID":"013566fd-5627-422a-809a-e81a8ec059d9","Type":"ContainerDied","Data":"ca55616f2e5de805229eb50d3f643de3b33e4b53039ccf7569dc0337fc8e14a5"} Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.703678 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.820086 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.882879 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-ovndb-tls-certs\") pod \"013566fd-5627-422a-809a-e81a8ec059d9\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.883006 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-httpd-config\") pod \"013566fd-5627-422a-809a-e81a8ec059d9\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.883093 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-combined-ca-bundle\") pod \"013566fd-5627-422a-809a-e81a8ec059d9\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.883236 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsctq\" (UniqueName: \"kubernetes.io/projected/013566fd-5627-422a-809a-e81a8ec059d9-kube-api-access-hsctq\") pod \"013566fd-5627-422a-809a-e81a8ec059d9\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.883263 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-config\") pod \"013566fd-5627-422a-809a-e81a8ec059d9\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.897417 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/013566fd-5627-422a-809a-e81a8ec059d9-kube-api-access-hsctq" (OuterVolumeSpecName: "kube-api-access-hsctq") pod "013566fd-5627-422a-809a-e81a8ec059d9" (UID: "013566fd-5627-422a-809a-e81a8ec059d9"). InnerVolumeSpecName "kube-api-access-hsctq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.900827 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "013566fd-5627-422a-809a-e81a8ec059d9" (UID: "013566fd-5627-422a-809a-e81a8ec059d9"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.960515 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.960584 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.985234 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-operator-scripts\") pod \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.985323 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-dkcdk\" (UniqueName: \"kubernetes.io/projected/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-kube-api-access-dkcdk\") pod \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.985761 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsctq\" (UniqueName: \"kubernetes.io/projected/013566fd-5627-422a-809a-e81a8ec059d9-kube-api-access-hsctq\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.985775 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.985962 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2" (UID: "e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.007707 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-kube-api-access-dkcdk" (OuterVolumeSpecName: "kube-api-access-dkcdk") pod "e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2" (UID: "e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2"). InnerVolumeSpecName "kube-api-access-dkcdk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.049465 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-config" (OuterVolumeSpecName: "config") pod "013566fd-5627-422a-809a-e81a8ec059d9" (UID: "013566fd-5627-422a-809a-e81a8ec059d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.050080 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "013566fd-5627-422a-809a-e81a8ec059d9" (UID: "013566fd-5627-422a-809a-e81a8ec059d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.076477 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zj224" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.081543 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qt42r" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.086967 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.086994 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.087003 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkcdk\" (UniqueName: \"kubernetes.io/projected/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-kube-api-access-dkcdk\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.087014 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.089034 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "013566fd-5627-422a-809a-e81a8ec059d9" (UID: "013566fd-5627-422a-809a-e81a8ec059d9"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.188266 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj6j5\" (UniqueName: \"kubernetes.io/projected/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-kube-api-access-lj6j5\") pod \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.188362 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-operator-scripts\") pod \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.188436 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z52tq\" (UniqueName: \"kubernetes.io/projected/a069955e-f546-4522-97ec-5a529f79b1aa-kube-api-access-z52tq\") pod \"a069955e-f546-4522-97ec-5a529f79b1aa\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.188515 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a069955e-f546-4522-97ec-5a529f79b1aa-operator-scripts\") pod \"a069955e-f546-4522-97ec-5a529f79b1aa\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.189083 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21b0442e-f4b4-4f59-b3c5-1510ae4d792c" (UID: "21b0442e-f4b4-4f59-b3c5-1510ae4d792c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.189089 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a069955e-f546-4522-97ec-5a529f79b1aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a069955e-f546-4522-97ec-5a529f79b1aa" (UID: "a069955e-f546-4522-97ec-5a529f79b1aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.189398 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.189418 4792 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.189427 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a069955e-f546-4522-97ec-5a529f79b1aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.191590 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a069955e-f546-4522-97ec-5a529f79b1aa-kube-api-access-z52tq" (OuterVolumeSpecName: "kube-api-access-z52tq") pod "a069955e-f546-4522-97ec-5a529f79b1aa" (UID: "a069955e-f546-4522-97ec-5a529f79b1aa"). InnerVolumeSpecName "kube-api-access-z52tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.195028 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-kube-api-access-lj6j5" (OuterVolumeSpecName: "kube-api-access-lj6j5") pod "21b0442e-f4b4-4f59-b3c5-1510ae4d792c" (UID: "21b0442e-f4b4-4f59-b3c5-1510ae4d792c"). InnerVolumeSpecName "kube-api-access-lj6j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.291456 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj6j5\" (UniqueName: \"kubernetes.io/projected/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-kube-api-access-lj6j5\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.291700 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z52tq\" (UniqueName: \"kubernetes.io/projected/a069955e-f546-4522-97ec-5a529f79b1aa-kube-api-access-z52tq\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.454774 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qt42r" event={"ID":"21b0442e-f4b4-4f59-b3c5-1510ae4d792c","Type":"ContainerDied","Data":"5ac29da5fa52816d6af4f4e774013cba308601b08c0ed36851caa33072f6b41d"} Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.454800 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qt42r" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.454805 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ac29da5fa52816d6af4f4e774013cba308601b08c0ed36851caa33072f6b41d" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.456520 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zj224" event={"ID":"a069955e-f546-4522-97ec-5a529f79b1aa","Type":"ContainerDied","Data":"035975688c011495ce93c6c1b1155e999f1292fecaac7c3671be568df507bd87"} Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.456568 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="035975688c011495ce93c6c1b1155e999f1292fecaac7c3671be568df507bd87" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.456537 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zj224" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.458789 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.458842 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" event={"ID":"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2","Type":"ContainerDied","Data":"e15a28a43dd6c1126f3b22e8480bf4ba4434e945130f45feec86bf90e7913964"} Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.458877 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e15a28a43dd6c1126f3b22e8480bf4ba4434e945130f45feec86bf90e7913964" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.462351 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bbc5b86d6-8b672" event={"ID":"013566fd-5627-422a-809a-e81a8ec059d9","Type":"ContainerDied","Data":"120df1c67b7935983b4052512da03cb57b70749fae0a4306db0f53d6bed8199c"} Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.462389 4792 scope.go:117] "RemoveContainer" containerID="2f7d0d1b6918c9e962de8e492d201469a86475bd623f4086bd8a9e1f30a74d63" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.462544 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bbc5b86d6-8b672"
Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.471477 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-central-agent" containerID="cri-o://20059357b7d0dcf77c24de3fd3876e0d7701b71cfa87b07f6aaaa2491b33046c" gracePeriod=30
Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.471662 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerStarted","Data":"387825e4421b63de17f8a95a35da8706ef3e20db6f2a3819f7e3c7def308bb9b"}
Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.471919 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.471982 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="proxy-httpd" containerID="cri-o://387825e4421b63de17f8a95a35da8706ef3e20db6f2a3819f7e3c7def308bb9b" gracePeriod=30
Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.472028 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="sg-core" containerID="cri-o://f1c336e0ebe268dd9ca94d30a4f01dbdf7925917a75d6e87aa3514ba966658df" gracePeriod=30
Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.472091 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-notification-agent" containerID="cri-o://2974b52b9fe70c71e1067f1e48fbfe7f66447a69916e1c15a62b5a2377bf5549" gracePeriod=30
Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.499733 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bbc5b86d6-8b672"]
Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.503860 4792 scope.go:117] "RemoveContainer" containerID="ca55616f2e5de805229eb50d3f643de3b33e4b53039ccf7569dc0337fc8e14a5"
Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.509500 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7bbc5b86d6-8b672"]
Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.526188 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.19990666 podStartE2EDuration="7.526024389s" podCreationTimestamp="2026-03-01 09:29:28 +0000 UTC" firstStartedPulling="2026-03-01 09:29:28.978710237 +0000 UTC m=+1298.220589434" lastFinishedPulling="2026-03-01 09:29:34.304827966 +0000 UTC m=+1303.546707163" observedRunningTime="2026-03-01 09:29:35.519742168 +0000 UTC m=+1304.761621365" watchObservedRunningTime="2026-03-01 09:29:35.526024389 +0000 UTC m=+1304.767903586"
Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.912056 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj"
Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.994179 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8f2r2"
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.000893 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-474a-account-create-update-dlgkl"
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.016739 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2be4f49-c20a-4e25-bff3-e4617d275fa1-operator-scripts\") pod \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") "
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.016800 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dgpl\" (UniqueName: \"kubernetes.io/projected/f2be4f49-c20a-4e25-bff3-e4617d275fa1-kube-api-access-9dgpl\") pod \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") "
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.019963 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2be4f49-c20a-4e25-bff3-e4617d275fa1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2be4f49-c20a-4e25-bff3-e4617d275fa1" (UID: "f2be4f49-c20a-4e25-bff3-e4617d275fa1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.022576 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2be4f49-c20a-4e25-bff3-e4617d275fa1-kube-api-access-9dgpl" (OuterVolumeSpecName: "kube-api-access-9dgpl") pod "f2be4f49-c20a-4e25-bff3-e4617d275fa1" (UID: "f2be4f49-c20a-4e25-bff3-e4617d275fa1"). InnerVolumeSpecName "kube-api-access-9dgpl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.150072 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09b4c86e-31ba-4d91-a602-39fa3a57c798-operator-scripts\") pod \"09b4c86e-31ba-4d91-a602-39fa3a57c798\" (UID: \"09b4c86e-31ba-4d91-a602-39fa3a57c798\") "
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.150318 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztqnk\" (UniqueName: \"kubernetes.io/projected/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-kube-api-access-ztqnk\") pod \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") "
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.150348 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-operator-scripts\") pod \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") "
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.150421 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw6bw\" (UniqueName: \"kubernetes.io/projected/09b4c86e-31ba-4d91-a602-39fa3a57c798-kube-api-access-fw6bw\") pod \"09b4c86e-31ba-4d91-a602-39fa3a57c798\" (UID: \"09b4c86e-31ba-4d91-a602-39fa3a57c798\") "
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.151171 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2be4f49-c20a-4e25-bff3-e4617d275fa1-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.151192 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dgpl\" (UniqueName: \"kubernetes.io/projected/f2be4f49-c20a-4e25-bff3-e4617d275fa1-kube-api-access-9dgpl\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.151972 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe" (UID: "f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.154311 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-kube-api-access-ztqnk" (OuterVolumeSpecName: "kube-api-access-ztqnk") pod "f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe" (UID: "f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe"). InnerVolumeSpecName "kube-api-access-ztqnk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.155247 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09b4c86e-31ba-4d91-a602-39fa3a57c798-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "09b4c86e-31ba-4d91-a602-39fa3a57c798" (UID: "09b4c86e-31ba-4d91-a602-39fa3a57c798"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.160345 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b4c86e-31ba-4d91-a602-39fa3a57c798-kube-api-access-fw6bw" (OuterVolumeSpecName: "kube-api-access-fw6bw") pod "09b4c86e-31ba-4d91-a602-39fa3a57c798" (UID: "09b4c86e-31ba-4d91-a602-39fa3a57c798"). InnerVolumeSpecName "kube-api-access-fw6bw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.252631 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09b4c86e-31ba-4d91-a602-39fa3a57c798-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.252668 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztqnk\" (UniqueName: \"kubernetes.io/projected/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-kube-api-access-ztqnk\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.252679 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.252688 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw6bw\" (UniqueName: \"kubernetes.io/projected/09b4c86e-31ba-4d91-a602-39fa3a57c798-kube-api-access-fw6bw\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.481037 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8f2r2" event={"ID":"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe","Type":"ContainerDied","Data":"dadc3c08b41a7bc9b02729db8b51278d209590e1ed422026617fbb3c1754363b"}
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.481549 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dadc3c08b41a7bc9b02729db8b51278d209590e1ed422026617fbb3c1754363b"
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.481125 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8f2r2"
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.493141 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-474a-account-create-update-dlgkl" event={"ID":"09b4c86e-31ba-4d91-a602-39fa3a57c798","Type":"ContainerDied","Data":"3feefdb24fc4187fb81280730c9fa62f502ce358f1aaf7380b427d95a1866213"}
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.493183 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3feefdb24fc4187fb81280730c9fa62f502ce358f1aaf7380b427d95a1866213"
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.493958 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-474a-account-create-update-dlgkl"
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.499679 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj"
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.499679 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" event={"ID":"f2be4f49-c20a-4e25-bff3-e4617d275fa1","Type":"ContainerDied","Data":"771711ce9c2752c3012e04bcade441dfda2723a81435feaa4ce6f2649bebee72"}
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.499889 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="771711ce9c2752c3012e04bcade441dfda2723a81435feaa4ce6f2649bebee72"
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.508396 4792 generic.go:334] "Generic (PLEG): container finished" podID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerID="387825e4421b63de17f8a95a35da8706ef3e20db6f2a3819f7e3c7def308bb9b" exitCode=0
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.508423 4792 generic.go:334] "Generic (PLEG): container finished" podID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerID="f1c336e0ebe268dd9ca94d30a4f01dbdf7925917a75d6e87aa3514ba966658df" exitCode=2
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.508431 4792 generic.go:334] "Generic (PLEG): container finished" podID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerID="2974b52b9fe70c71e1067f1e48fbfe7f66447a69916e1c15a62b5a2377bf5549" exitCode=0
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.508448 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerDied","Data":"387825e4421b63de17f8a95a35da8706ef3e20db6f2a3819f7e3c7def308bb9b"}
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.508468 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerDied","Data":"f1c336e0ebe268dd9ca94d30a4f01dbdf7925917a75d6e87aa3514ba966658df"}
Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.508478 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerDied","Data":"2974b52b9fe70c71e1067f1e48fbfe7f66447a69916e1c15a62b5a2377bf5549"}
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.421587 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="013566fd-5627-422a-809a-e81a8ec059d9" path="/var/lib/kubelet/pods/013566fd-5627-422a-809a-e81a8ec059d9/volumes"
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.526240 4792 generic.go:334] "Generic (PLEG): container finished" podID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerID="20059357b7d0dcf77c24de3fd3876e0d7701b71cfa87b07f6aaaa2491b33046c" exitCode=0
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.526283 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerDied","Data":"20059357b7d0dcf77c24de3fd3876e0d7701b71cfa87b07f6aaaa2491b33046c"}
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.781587 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.879668 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptvq9\" (UniqueName: \"kubernetes.io/projected/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-kube-api-access-ptvq9\") pod \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") "
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.879777 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-combined-ca-bundle\") pod \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") "
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.879929 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-scripts\") pod \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") "
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.879948 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-sg-core-conf-yaml\") pod \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") "
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.880050 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-config-data\") pod \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") "
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.880554 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-log-httpd\") pod \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") "
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.880612 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-run-httpd\") pod \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") "
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.880836 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "594710f1-32aa-4acc-a8ea-8cfec7b2c28c" (UID: "594710f1-32aa-4acc-a8ea-8cfec7b2c28c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.881198 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.882685 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "594710f1-32aa-4acc-a8ea-8cfec7b2c28c" (UID: "594710f1-32aa-4acc-a8ea-8cfec7b2c28c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.885638 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-scripts" (OuterVolumeSpecName: "scripts") pod "594710f1-32aa-4acc-a8ea-8cfec7b2c28c" (UID: "594710f1-32aa-4acc-a8ea-8cfec7b2c28c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.911590 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-kube-api-access-ptvq9" (OuterVolumeSpecName: "kube-api-access-ptvq9") pod "594710f1-32aa-4acc-a8ea-8cfec7b2c28c" (UID: "594710f1-32aa-4acc-a8ea-8cfec7b2c28c"). InnerVolumeSpecName "kube-api-access-ptvq9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.951881 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "594710f1-32aa-4acc-a8ea-8cfec7b2c28c" (UID: "594710f1-32aa-4acc-a8ea-8cfec7b2c28c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.980989 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "594710f1-32aa-4acc-a8ea-8cfec7b2c28c" (UID: "594710f1-32aa-4acc-a8ea-8cfec7b2c28c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.983045 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.983245 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.983329 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.983387 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.983591 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptvq9\" (UniqueName: \"kubernetes.io/projected/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-kube-api-access-ptvq9\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.015729 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-config-data" (OuterVolumeSpecName: "config-data") pod "594710f1-32aa-4acc-a8ea-8cfec7b2c28c" (UID: "594710f1-32aa-4acc-a8ea-8cfec7b2c28c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.072316 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-84f9696594-qdwsv"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.085989 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.208326 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-84f9696594-qdwsv"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.284206 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7f7447dcd6-cpnn5"]
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.284532 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7f7447dcd6-cpnn5" podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-log" containerID="cri-o://4555af75d6b8f403b02d065ff405189b215772f534b2956e72f7441d250ba2de" gracePeriod=30
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.284681 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7f7447dcd6-cpnn5" podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-api" containerID="cri-o://9288e4694d44c535f5db6a3588b5f57161da3432c7fdcf10f8abb56af7bd4be9" gracePeriod=30
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.537066 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerDied","Data":"a26311f7480277741aeeb123ba5ce9e74cfc4cc3c5c9582b395475f7202f8016"}
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.537122 4792 scope.go:117] "RemoveContainer" containerID="387825e4421b63de17f8a95a35da8706ef3e20db6f2a3819f7e3c7def308bb9b"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.537130 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.540539 4792 generic.go:334] "Generic (PLEG): container finished" podID="947b32da-5664-42ff-a665-ac182dea1433" containerID="4555af75d6b8f403b02d065ff405189b215772f534b2956e72f7441d250ba2de" exitCode=143
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.540790 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7447dcd6-cpnn5" event={"ID":"947b32da-5664-42ff-a665-ac182dea1433","Type":"ContainerDied","Data":"4555af75d6b8f403b02d065ff405189b215772f534b2956e72f7441d250ba2de"}
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.568248 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.577619 4792 scope.go:117] "RemoveContainer" containerID="f1c336e0ebe268dd9ca94d30a4f01dbdf7925917a75d6e87aa3514ba966658df"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.586556 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.615592 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616006 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b0442e-f4b4-4f59-b3c5-1510ae4d792c" containerName="mariadb-database-create"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616022 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b0442e-f4b4-4f59-b3c5-1510ae4d792c" containerName="mariadb-database-create"
Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616032 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="sg-core"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616037 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="sg-core"
Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616049 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe" containerName="mariadb-database-create"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616056 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe" containerName="mariadb-database-create"
Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616064 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2" containerName="mariadb-account-create-update"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616069 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2" containerName="mariadb-account-create-update"
Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616086 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-notification-agent"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616092 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-notification-agent"
Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616108 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2be4f49-c20a-4e25-bff3-e4617d275fa1" containerName="mariadb-account-create-update"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616114 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2be4f49-c20a-4e25-bff3-e4617d275fa1" containerName="mariadb-account-create-update"
Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616123 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-central-agent"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616129 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-central-agent"
Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616140 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-httpd"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616147 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-httpd"
Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616158 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="proxy-httpd"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616163 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="proxy-httpd"
Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616172 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-api"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616178 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-api"
Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616185 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b4c86e-31ba-4d91-a602-39fa3a57c798" containerName="mariadb-account-create-update"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616191 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b4c86e-31ba-4d91-a602-39fa3a57c798" containerName="mariadb-account-create-update"
Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616205 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a069955e-f546-4522-97ec-5a529f79b1aa" containerName="mariadb-database-create"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616211 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a069955e-f546-4522-97ec-5a529f79b1aa" containerName="mariadb-database-create"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616363 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b4c86e-31ba-4d91-a602-39fa3a57c798" containerName="mariadb-account-create-update"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616394 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-central-agent"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616408 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="proxy-httpd"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616419 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b0442e-f4b4-4f59-b3c5-1510ae4d792c" containerName="mariadb-database-create"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616433 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2" containerName="mariadb-account-create-update"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616446 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2be4f49-c20a-4e25-bff3-e4617d275fa1" containerName="mariadb-account-create-update"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616453 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-httpd"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616468 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-api"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616488 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-notification-agent"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616502 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a069955e-f546-4522-97ec-5a529f79b1aa" containerName="mariadb-database-create"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616511 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe" containerName="mariadb-database-create"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616520 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="sg-core"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.618513 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.619601 4792 scope.go:117] "RemoveContainer" containerID="2974b52b9fe70c71e1067f1e48fbfe7f66447a69916e1c15a62b5a2377bf5549"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.621044 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.621163 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.680636 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.684684 4792 scope.go:117] "RemoveContainer" containerID="20059357b7d0dcf77c24de3fd3876e0d7701b71cfa87b07f6aaaa2491b33046c"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.702062 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.702607 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-run-httpd\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.702761 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b67jc\" (UniqueName: \"kubernetes.io/projected/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-kube-api-access-b67jc\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.702834 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-scripts\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.702918 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-log-httpd\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.702997 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.703075 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-config-data\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.804503 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-config-data\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.805374 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.805397 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-run-httpd\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.805749 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b67jc\" (UniqueName: \"kubernetes.io/projected/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-kube-api-access-b67jc\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.805779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-scripts\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.805799 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-log-httpd\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0"
Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.805831 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0"
Mar 01 09:29:38 
crc kubenswrapper[4792]: I0301 09:29:38.806174 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-run-httpd\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.806516 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-log-httpd\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.810364 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.810988 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.811599 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-config-data\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.822575 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-scripts\") pod \"ceilometer-0\" (UID: 
\"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.828589 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b67jc\" (UniqueName: \"kubernetes.io/projected/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-kube-api-access-b67jc\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.947109 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:29:39 crc kubenswrapper[4792]: I0301 09:29:39.418092 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" path="/var/lib/kubelet/pods/594710f1-32aa-4acc-a8ea-8cfec7b2c28c/volumes" Mar 01 09:29:39 crc kubenswrapper[4792]: I0301 09:29:39.445074 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:39 crc kubenswrapper[4792]: I0301 09:29:39.552322 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerStarted","Data":"0f890f6f19c9d8250258adc7aeb8d26ea84175d18054680a23e399f6e29382bc"} Mar 01 09:29:40 crc kubenswrapper[4792]: I0301 09:29:40.561802 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerStarted","Data":"03d77111dbe6af8e832c82b26c1b62e1ab0b322bd6de059889982bff5460bbf9"} Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.077854 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tp5l7"] Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.079781 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.088152 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hfrkw" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.089332 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.092623 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.102095 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tp5l7"] Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.178073 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.178145 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-config-data\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.178196 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4jjf\" (UniqueName: \"kubernetes.io/projected/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-kube-api-access-j4jjf\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " 
pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.178225 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-scripts\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.279477 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-config-data\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.279756 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4jjf\" (UniqueName: \"kubernetes.io/projected/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-kube-api-access-j4jjf\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.279779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-scripts\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.279869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " 
pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.305138 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.305161 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-config-data\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.305488 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-scripts\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.319401 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4jjf\" (UniqueName: \"kubernetes.io/projected/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-kube-api-access-j4jjf\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.543245 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.606551 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerStarted","Data":"821a69d721c5f5e39af061016fb4dd3aae85412f19901bf33a586d3b5889a0bf"} Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.606592 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerStarted","Data":"11adf9995e830da132d5cee2f25399dd37eec4af12f052d976a4771af2aa9ab7"} Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.614387 4792 generic.go:334] "Generic (PLEG): container finished" podID="947b32da-5664-42ff-a665-ac182dea1433" containerID="9288e4694d44c535f5db6a3588b5f57161da3432c7fdcf10f8abb56af7bd4be9" exitCode=0 Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.614426 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7447dcd6-cpnn5" event={"ID":"947b32da-5664-42ff-a665-ac182dea1433","Type":"ContainerDied","Data":"9288e4694d44c535f5db6a3588b5f57161da3432c7fdcf10f8abb56af7bd4be9"} Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.952274 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7f7447dcd6-cpnn5" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.995785 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-internal-tls-certs\") pod \"947b32da-5664-42ff-a665-ac182dea1433\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.995848 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-public-tls-certs\") pod \"947b32da-5664-42ff-a665-ac182dea1433\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.995893 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9trll\" (UniqueName: \"kubernetes.io/projected/947b32da-5664-42ff-a665-ac182dea1433-kube-api-access-9trll\") pod \"947b32da-5664-42ff-a665-ac182dea1433\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.995995 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/947b32da-5664-42ff-a665-ac182dea1433-logs\") pod \"947b32da-5664-42ff-a665-ac182dea1433\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.996013 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-config-data\") pod \"947b32da-5664-42ff-a665-ac182dea1433\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.996064 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-scripts\") pod \"947b32da-5664-42ff-a665-ac182dea1433\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.996088 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-combined-ca-bundle\") pod \"947b32da-5664-42ff-a665-ac182dea1433\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.997310 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/947b32da-5664-42ff-a665-ac182dea1433-logs" (OuterVolumeSpecName: "logs") pod "947b32da-5664-42ff-a665-ac182dea1433" (UID: "947b32da-5664-42ff-a665-ac182dea1433"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.008546 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/947b32da-5664-42ff-a665-ac182dea1433-kube-api-access-9trll" (OuterVolumeSpecName: "kube-api-access-9trll") pod "947b32da-5664-42ff-a665-ac182dea1433" (UID: "947b32da-5664-42ff-a665-ac182dea1433"). InnerVolumeSpecName "kube-api-access-9trll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.020040 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-scripts" (OuterVolumeSpecName: "scripts") pod "947b32da-5664-42ff-a665-ac182dea1433" (UID: "947b32da-5664-42ff-a665-ac182dea1433"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.090217 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tp5l7"] Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.092022 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-config-data" (OuterVolumeSpecName: "config-data") pod "947b32da-5664-42ff-a665-ac182dea1433" (UID: "947b32da-5664-42ff-a665-ac182dea1433"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.097891 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9trll\" (UniqueName: \"kubernetes.io/projected/947b32da-5664-42ff-a665-ac182dea1433-kube-api-access-9trll\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.097927 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/947b32da-5664-42ff-a665-ac182dea1433-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.097937 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.098033 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.112335 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "947b32da-5664-42ff-a665-ac182dea1433" 
(UID: "947b32da-5664-42ff-a665-ac182dea1433"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.134237 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "947b32da-5664-42ff-a665-ac182dea1433" (UID: "947b32da-5664-42ff-a665-ac182dea1433"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.139767 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "947b32da-5664-42ff-a665-ac182dea1433" (UID: "947b32da-5664-42ff-a665-ac182dea1433"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.199169 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.199200 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.199209 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.624186 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-tp5l7" event={"ID":"de6ead5c-face-41ff-ab6e-aebb7ca73c1c","Type":"ContainerStarted","Data":"69bc546e1a44d03d4232782332f19345cbd8f578e040e6ba8b8c80c9abe934e9"} Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.626810 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7447dcd6-cpnn5" event={"ID":"947b32da-5664-42ff-a665-ac182dea1433","Type":"ContainerDied","Data":"5d91235d937c34cd2f7190314bb2d67a56ac04c2b63da01a91c659eee91c9179"} Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.626858 4792 scope.go:117] "RemoveContainer" containerID="9288e4694d44c535f5db6a3588b5f57161da3432c7fdcf10f8abb56af7bd4be9" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.627006 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f7447dcd6-cpnn5" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.672229 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7f7447dcd6-cpnn5"] Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.678397 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7f7447dcd6-cpnn5"] Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.689308 4792 scope.go:117] "RemoveContainer" containerID="4555af75d6b8f403b02d065ff405189b215772f534b2956e72f7441d250ba2de" Mar 01 09:29:43 crc kubenswrapper[4792]: I0301 09:29:43.419628 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="947b32da-5664-42ff-a665-ac182dea1433" path="/var/lib/kubelet/pods/947b32da-5664-42ff-a665-ac182dea1433/volumes" Mar 01 09:29:43 crc kubenswrapper[4792]: I0301 09:29:43.644385 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerStarted","Data":"b5e11cb105cfba6cc6777dcc36d6a7386137980e60c28e5e4cb193f8ef3736fb"} Mar 01 09:29:43 crc kubenswrapper[4792]: I0301 09:29:43.645573 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 01 09:29:43 crc kubenswrapper[4792]: I0301 09:29:43.673046 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.319227069 podStartE2EDuration="5.672999376s" podCreationTimestamp="2026-03-01 09:29:38 +0000 UTC" firstStartedPulling="2026-03-01 09:29:39.445057809 +0000 UTC m=+1308.686937016" lastFinishedPulling="2026-03-01 09:29:42.798830126 +0000 UTC m=+1312.040709323" observedRunningTime="2026-03-01 09:29:43.668606111 +0000 UTC m=+1312.910485308" watchObservedRunningTime="2026-03-01 09:29:43.672999376 +0000 UTC m=+1312.914878573" Mar 01 09:29:45 crc kubenswrapper[4792]: I0301 09:29:45.321371 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:46 crc kubenswrapper[4792]: I0301 09:29:46.665763 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-central-agent" containerID="cri-o://03d77111dbe6af8e832c82b26c1b62e1ab0b322bd6de059889982bff5460bbf9" gracePeriod=30 Mar 01 09:29:46 crc kubenswrapper[4792]: I0301 09:29:46.665816 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-notification-agent" containerID="cri-o://11adf9995e830da132d5cee2f25399dd37eec4af12f052d976a4771af2aa9ab7" gracePeriod=30 Mar 01 09:29:46 crc kubenswrapper[4792]: I0301 09:29:46.665810 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="proxy-httpd" containerID="cri-o://b5e11cb105cfba6cc6777dcc36d6a7386137980e60c28e5e4cb193f8ef3736fb" gracePeriod=30 Mar 01 09:29:46 crc kubenswrapper[4792]: I0301 09:29:46.665840 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="sg-core" containerID="cri-o://821a69d721c5f5e39af061016fb4dd3aae85412f19901bf33a586d3b5889a0bf" gracePeriod=30 Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.682980 4792 generic.go:334] "Generic (PLEG): container finished" podID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerID="b5e11cb105cfba6cc6777dcc36d6a7386137980e60c28e5e4cb193f8ef3736fb" exitCode=0 Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.683259 4792 generic.go:334] "Generic (PLEG): container finished" podID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerID="821a69d721c5f5e39af061016fb4dd3aae85412f19901bf33a586d3b5889a0bf" exitCode=2 Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.683269 4792 generic.go:334] "Generic (PLEG): container finished" podID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerID="11adf9995e830da132d5cee2f25399dd37eec4af12f052d976a4771af2aa9ab7" exitCode=0 Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.683276 4792 generic.go:334] "Generic (PLEG): container finished" podID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerID="03d77111dbe6af8e832c82b26c1b62e1ab0b322bd6de059889982bff5460bbf9" exitCode=0 Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.683294 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerDied","Data":"b5e11cb105cfba6cc6777dcc36d6a7386137980e60c28e5e4cb193f8ef3736fb"} Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.683318 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerDied","Data":"821a69d721c5f5e39af061016fb4dd3aae85412f19901bf33a586d3b5889a0bf"} Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.683328 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerDied","Data":"11adf9995e830da132d5cee2f25399dd37eec4af12f052d976a4771af2aa9ab7"} Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.683337 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerDied","Data":"03d77111dbe6af8e832c82b26c1b62e1ab0b322bd6de059889982bff5460bbf9"} Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.245922 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.398304 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-run-httpd\") pod \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.398380 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-combined-ca-bundle\") pod \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.398430 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b67jc\" (UniqueName: \"kubernetes.io/projected/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-kube-api-access-b67jc\") pod \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.398487 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-log-httpd\") pod 
\"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.398511 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-config-data\") pod \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.398527 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-scripts\") pod \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.398570 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-sg-core-conf-yaml\") pod \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.399102 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" (UID: "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.399373 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" (UID: "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.399683 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.399703 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.403229 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-scripts" (OuterVolumeSpecName: "scripts") pod "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" (UID: "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.403543 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-kube-api-access-b67jc" (OuterVolumeSpecName: "kube-api-access-b67jc") pod "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" (UID: "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf"). InnerVolumeSpecName "kube-api-access-b67jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.448533 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" (UID: "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.499139 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" (UID: "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.501359 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.501402 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b67jc\" (UniqueName: \"kubernetes.io/projected/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-kube-api-access-b67jc\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.501425 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.501442 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.538053 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-config-data" (OuterVolumeSpecName: "config-data") pod "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" (UID: "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.603101 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.726428 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" event={"ID":"de6ead5c-face-41ff-ab6e-aebb7ca73c1c","Type":"ContainerStarted","Data":"b0e1e09f850992f4661d6be2a0a76260a157f0e5c0875fd88ff0bc92644a8d13"} Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.731939 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerDied","Data":"0f890f6f19c9d8250258adc7aeb8d26ea84175d18054680a23e399f6e29382bc"} Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.732002 4792 scope.go:117] "RemoveContainer" containerID="b5e11cb105cfba6cc6777dcc36d6a7386137980e60c28e5e4cb193f8ef3736fb" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.732174 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.769678 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" podStartSLOduration=1.869845159 podStartE2EDuration="10.769653728s" podCreationTimestamp="2026-03-01 09:29:41 +0000 UTC" firstStartedPulling="2026-03-01 09:29:42.099768424 +0000 UTC m=+1311.341647621" lastFinishedPulling="2026-03-01 09:29:50.999576993 +0000 UTC m=+1320.241456190" observedRunningTime="2026-03-01 09:29:51.743455629 +0000 UTC m=+1320.985334866" watchObservedRunningTime="2026-03-01 09:29:51.769653728 +0000 UTC m=+1321.011532925" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.771769 4792 scope.go:117] "RemoveContainer" containerID="821a69d721c5f5e39af061016fb4dd3aae85412f19901bf33a586d3b5889a0bf" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.787237 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.794639 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.834451 4792 scope.go:117] "RemoveContainer" containerID="11adf9995e830da132d5cee2f25399dd37eec4af12f052d976a4771af2aa9ab7" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.842730 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:51 crc kubenswrapper[4792]: E0301 09:29:51.843177 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="sg-core" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843191 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="sg-core" Mar 01 09:29:51 crc kubenswrapper[4792]: E0301 09:29:51.843213 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-log" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843220 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-log" Mar 01 09:29:51 crc kubenswrapper[4792]: E0301 09:29:51.843234 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="proxy-httpd" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843240 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="proxy-httpd" Mar 01 09:29:51 crc kubenswrapper[4792]: E0301 09:29:51.843252 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-api" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843257 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-api" Mar 01 09:29:51 crc kubenswrapper[4792]: E0301 09:29:51.843273 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-central-agent" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843279 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-central-agent" Mar 01 09:29:51 crc kubenswrapper[4792]: E0301 09:29:51.843291 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-notification-agent" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843297 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-notification-agent" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843454 4792 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-central-agent" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843471 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-notification-agent" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843480 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-log" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843490 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-api" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843499 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="proxy-httpd" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843509 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="sg-core" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.845407 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.849583 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.852142 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.859221 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.880075 4792 scope.go:117] "RemoveContainer" containerID="03d77111dbe6af8e832c82b26c1b62e1ab0b322bd6de059889982bff5460bbf9" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.909452 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.910280 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-config-data\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.910312 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-log-httpd\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.910344 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5gxqc\" (UniqueName: \"kubernetes.io/projected/0a01d08c-d6df-4d6f-8541-1900fdc49572-kube-api-access-5gxqc\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.910402 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.910423 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-scripts\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.910455 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-run-httpd\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.011722 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.011849 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-config-data\") pod \"ceilometer-0\" (UID: 
\"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.011869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-log-httpd\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.012375 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-log-httpd\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.012401 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gxqc\" (UniqueName: \"kubernetes.io/projected/0a01d08c-d6df-4d6f-8541-1900fdc49572-kube-api-access-5gxqc\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.012443 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.012505 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-scripts\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.012533 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-run-httpd\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.012864 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-run-httpd\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.015391 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.017833 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-config-data\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.024080 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.029618 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-scripts\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.031861 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gxqc\" (UniqueName: \"kubernetes.io/projected/0a01d08c-d6df-4d6f-8541-1900fdc49572-kube-api-access-5gxqc\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.176034 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.689295 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:52 crc kubenswrapper[4792]: W0301 09:29:52.695152 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a01d08c_d6df_4d6f_8541_1900fdc49572.slice/crio-e73ae627714e9a71df05613aefebb2895fe9d5a5671a7f1ed0848a6a3b3c37e9 WatchSource:0}: Error finding container e73ae627714e9a71df05613aefebb2895fe9d5a5671a7f1ed0848a6a3b3c37e9: Status 404 returned error can't find the container with id e73ae627714e9a71df05613aefebb2895fe9d5a5671a7f1ed0848a6a3b3c37e9 Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.741075 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerStarted","Data":"e73ae627714e9a71df05613aefebb2895fe9d5a5671a7f1ed0848a6a3b3c37e9"} Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.942681 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:53 crc kubenswrapper[4792]: I0301 09:29:53.425729 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" path="/var/lib/kubelet/pods/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf/volumes" Mar 01 09:29:53 crc kubenswrapper[4792]: I0301 09:29:53.749943 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerStarted","Data":"92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f"} Mar 01 09:29:54 crc kubenswrapper[4792]: I0301 09:29:54.758664 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerStarted","Data":"1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc"} Mar 01 09:29:54 crc kubenswrapper[4792]: I0301 09:29:54.760026 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerStarted","Data":"a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5"} Mar 01 09:29:57 crc kubenswrapper[4792]: I0301 09:29:57.782703 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerStarted","Data":"d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f"} Mar 01 09:29:57 crc kubenswrapper[4792]: I0301 09:29:57.783402 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-central-agent" containerID="cri-o://92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f" gracePeriod=30 Mar 01 09:29:57 crc kubenswrapper[4792]: I0301 09:29:57.783645 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 01 09:29:57 crc kubenswrapper[4792]: I0301 09:29:57.784178 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="proxy-httpd" containerID="cri-o://d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f" gracePeriod=30 Mar 01 09:29:57 crc kubenswrapper[4792]: I0301 09:29:57.784226 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="sg-core" containerID="cri-o://1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc" gracePeriod=30 Mar 01 09:29:57 crc kubenswrapper[4792]: I0301 09:29:57.784260 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-notification-agent" containerID="cri-o://a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5" gracePeriod=30 Mar 01 09:29:57 crc kubenswrapper[4792]: I0301 09:29:57.811486 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.375608685 podStartE2EDuration="6.811467158s" podCreationTimestamp="2026-03-01 09:29:51 +0000 UTC" firstStartedPulling="2026-03-01 09:29:52.698312165 +0000 UTC m=+1321.940191382" lastFinishedPulling="2026-03-01 09:29:57.134170658 +0000 UTC m=+1326.376049855" observedRunningTime="2026-03-01 09:29:57.804702216 +0000 UTC m=+1327.046581413" watchObservedRunningTime="2026-03-01 09:29:57.811467158 +0000 UTC m=+1327.053346345" Mar 01 09:29:58 crc kubenswrapper[4792]: I0301 09:29:58.809319 4792 generic.go:334] "Generic (PLEG): container finished" podID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerID="d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f" exitCode=0 Mar 01 09:29:58 crc kubenswrapper[4792]: I0301 09:29:58.810149 4792 generic.go:334] "Generic (PLEG): container finished" podID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerID="1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc" exitCode=2 Mar 01 09:29:58 crc kubenswrapper[4792]: I0301 09:29:58.810225 4792 generic.go:334] "Generic (PLEG): container finished" podID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerID="a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5" 
exitCode=0 Mar 01 09:29:58 crc kubenswrapper[4792]: I0301 09:29:58.809385 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerDied","Data":"d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f"} Mar 01 09:29:58 crc kubenswrapper[4792]: I0301 09:29:58.810363 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerDied","Data":"1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc"} Mar 01 09:29:58 crc kubenswrapper[4792]: I0301 09:29:58.810435 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerDied","Data":"a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5"} Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.137013 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"] Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.138781 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.144661 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539290-npvbb"] Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.145708 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.145811 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539290-npvbb" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.154457 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.154883 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.155024 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.155665 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539290-npvbb"] Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.157552 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.164963 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"] Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.265257 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjrfl\" (UniqueName: \"kubernetes.io/projected/29833925-b21b-44d4-954c-e3252e5e69c4-kube-api-access-mjrfl\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.265331 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxxtb\" (UniqueName: \"kubernetes.io/projected/d3644e57-7093-4402-a6f2-48ed10ac14fa-kube-api-access-nxxtb\") pod \"auto-csr-approver-29539290-npvbb\" (UID: 
\"d3644e57-7093-4402-a6f2-48ed10ac14fa\") " pod="openshift-infra/auto-csr-approver-29539290-npvbb" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.265391 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29833925-b21b-44d4-954c-e3252e5e69c4-secret-volume\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.265416 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29833925-b21b-44d4-954c-e3252e5e69c4-config-volume\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.367280 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29833925-b21b-44d4-954c-e3252e5e69c4-secret-volume\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.367342 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29833925-b21b-44d4-954c-e3252e5e69c4-config-volume\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.367429 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjrfl\" (UniqueName: 
\"kubernetes.io/projected/29833925-b21b-44d4-954c-e3252e5e69c4-kube-api-access-mjrfl\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.367506 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxxtb\" (UniqueName: \"kubernetes.io/projected/d3644e57-7093-4402-a6f2-48ed10ac14fa-kube-api-access-nxxtb\") pod \"auto-csr-approver-29539290-npvbb\" (UID: \"d3644e57-7093-4402-a6f2-48ed10ac14fa\") " pod="openshift-infra/auto-csr-approver-29539290-npvbb"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.368353 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29833925-b21b-44d4-954c-e3252e5e69c4-config-volume\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.377671 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29833925-b21b-44d4-954c-e3252e5e69c4-secret-volume\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.388525 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjrfl\" (UniqueName: \"kubernetes.io/projected/29833925-b21b-44d4-954c-e3252e5e69c4-kube-api-access-mjrfl\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.393148 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxxtb\" (UniqueName: \"kubernetes.io/projected/d3644e57-7093-4402-a6f2-48ed10ac14fa-kube-api-access-nxxtb\") pod \"auto-csr-approver-29539290-npvbb\" (UID: \"d3644e57-7093-4402-a6f2-48ed10ac14fa\") " pod="openshift-infra/auto-csr-approver-29539290-npvbb"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.464352 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.479346 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539290-npvbb"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.756171 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.836531 4792 generic.go:334] "Generic (PLEG): container finished" podID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerID="92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f" exitCode=0
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.836570 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerDied","Data":"92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f"}
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.836601 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerDied","Data":"e73ae627714e9a71df05613aefebb2895fe9d5a5671a7f1ed0848a6a3b3c37e9"}
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.836617 4792 scope.go:117] "RemoveContainer" containerID="d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.836743 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.866155 4792 scope.go:117] "RemoveContainer" containerID="1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.880223 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-log-httpd\") pod \"0a01d08c-d6df-4d6f-8541-1900fdc49572\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") "
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.880290 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-combined-ca-bundle\") pod \"0a01d08c-d6df-4d6f-8541-1900fdc49572\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") "
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.880406 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-config-data\") pod \"0a01d08c-d6df-4d6f-8541-1900fdc49572\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") "
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.880429 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-run-httpd\") pod \"0a01d08c-d6df-4d6f-8541-1900fdc49572\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") "
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.880462 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-scripts\") pod \"0a01d08c-d6df-4d6f-8541-1900fdc49572\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") "
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.880516 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-sg-core-conf-yaml\") pod \"0a01d08c-d6df-4d6f-8541-1900fdc49572\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") "
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.880949 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gxqc\" (UniqueName: \"kubernetes.io/projected/0a01d08c-d6df-4d6f-8541-1900fdc49572-kube-api-access-5gxqc\") pod \"0a01d08c-d6df-4d6f-8541-1900fdc49572\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") "
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.882337 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0a01d08c-d6df-4d6f-8541-1900fdc49572" (UID: "0a01d08c-d6df-4d6f-8541-1900fdc49572"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.882878 4792 scope.go:117] "RemoveContainer" containerID="a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.882928 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0a01d08c-d6df-4d6f-8541-1900fdc49572" (UID: "0a01d08c-d6df-4d6f-8541-1900fdc49572"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.888326 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a01d08c-d6df-4d6f-8541-1900fdc49572-kube-api-access-5gxqc" (OuterVolumeSpecName: "kube-api-access-5gxqc") pod "0a01d08c-d6df-4d6f-8541-1900fdc49572" (UID: "0a01d08c-d6df-4d6f-8541-1900fdc49572"). InnerVolumeSpecName "kube-api-access-5gxqc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.888628 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-scripts" (OuterVolumeSpecName: "scripts") pod "0a01d08c-d6df-4d6f-8541-1900fdc49572" (UID: "0a01d08c-d6df-4d6f-8541-1900fdc49572"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.911144 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0a01d08c-d6df-4d6f-8541-1900fdc49572" (UID: "0a01d08c-d6df-4d6f-8541-1900fdc49572"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.912029 4792 scope.go:117] "RemoveContainer" containerID="92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.937877 4792 scope.go:117] "RemoveContainer" containerID="d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f"
Mar 01 09:30:00 crc kubenswrapper[4792]: E0301 09:30:00.938232 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f\": container with ID starting with d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f not found: ID does not exist" containerID="d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.938256 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f"} err="failed to get container status \"d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f\": rpc error: code = NotFound desc = could not find container \"d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f\": container with ID starting with d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f not found: ID does not exist"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.938275 4792 scope.go:117] "RemoveContainer" containerID="1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc"
Mar 01 09:30:00 crc kubenswrapper[4792]: E0301 09:30:00.938640 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc\": container with ID starting with 1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc not found: ID does not exist" containerID="1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.938660 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc"} err="failed to get container status \"1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc\": rpc error: code = NotFound desc = could not find container \"1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc\": container with ID starting with 1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc not found: ID does not exist"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.938673 4792 scope.go:117] "RemoveContainer" containerID="a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5"
Mar 01 09:30:00 crc kubenswrapper[4792]: E0301 09:30:00.938888 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5\": container with ID starting with a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5 not found: ID does not exist" containerID="a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.938917 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5"} err="failed to get container status \"a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5\": rpc error: code = NotFound desc = could not find container \"a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5\": container with ID starting with a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5 not found: ID does not exist"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.938931 4792 scope.go:117] "RemoveContainer" containerID="92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f"
Mar 01 09:30:00 crc kubenswrapper[4792]: E0301 09:30:00.939162 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f\": container with ID starting with 92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f not found: ID does not exist" containerID="92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.939183 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f"} err="failed to get container status \"92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f\": rpc error: code = NotFound desc = could not find container \"92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f\": container with ID starting with 92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f not found: ID does not exist"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.976280 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a01d08c-d6df-4d6f-8541-1900fdc49572" (UID: "0a01d08c-d6df-4d6f-8541-1900fdc49572"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.978869 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-config-data" (OuterVolumeSpecName: "config-data") pod "0a01d08c-d6df-4d6f-8541-1900fdc49572" (UID: "0a01d08c-d6df-4d6f-8541-1900fdc49572"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.982846 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gxqc\" (UniqueName: \"kubernetes.io/projected/0a01d08c-d6df-4d6f-8541-1900fdc49572-kube-api-access-5gxqc\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.982944 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.983020 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.983077 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.983130 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.983190 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.983252 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.109166 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"]
Mar 01 09:30:01 crc kubenswrapper[4792]: W0301 09:30:01.109588 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29833925_b21b_44d4_954c_e3252e5e69c4.slice/crio-7de1da900ab59863539f4e6656b31f82070f579ba91efc05ea10d5e6c1d0ff39 WatchSource:0}: Error finding container 7de1da900ab59863539f4e6656b31f82070f579ba91efc05ea10d5e6c1d0ff39: Status 404 returned error can't find the container with id 7de1da900ab59863539f4e6656b31f82070f579ba91efc05ea10d5e6c1d0ff39
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.176494 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.192882 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.205374 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:30:01 crc kubenswrapper[4792]: E0301 09:30:01.205858 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-notification-agent"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.205881 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-notification-agent"
Mar 01 09:30:01 crc kubenswrapper[4792]: E0301 09:30:01.205926 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-central-agent"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.205936 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-central-agent"
Mar 01 09:30:01 crc kubenswrapper[4792]: E0301 09:30:01.205984 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="proxy-httpd"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.205994 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="proxy-httpd"
Mar 01 09:30:01 crc kubenswrapper[4792]: E0301 09:30:01.206021 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="sg-core"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.206062 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="sg-core"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.206257 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="proxy-httpd"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.206276 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-central-agent"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.206293 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="sg-core"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.206305 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-notification-agent"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.211115 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.221668 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.222148 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.266607 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.282337 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539290-npvbb"]
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.297592 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-run-httpd\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.299673 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-config-data\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.299825 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.299992 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54hjp\" (UniqueName: \"kubernetes.io/projected/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-kube-api-access-54hjp\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.300136 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-scripts\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.300244 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.300359 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-log-httpd\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: E0301 09:30:01.375056 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a01d08c_d6df_4d6f_8541_1900fdc49572.slice\": RecentStats: unable to find data in memory cache]"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.402572 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-config-data\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.402652 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.402720 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54hjp\" (UniqueName: \"kubernetes.io/projected/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-kube-api-access-54hjp\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.402756 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-scripts\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.402802 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.402825 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-log-httpd\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.402932 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-run-httpd\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.403832 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-run-httpd\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.405006 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-log-httpd\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.409481 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.409828 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-config-data\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.411526 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.419424 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" path="/var/lib/kubelet/pods/0a01d08c-d6df-4d6f-8541-1900fdc49572/volumes"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.422013 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-scripts\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.423962 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54hjp\" (UniqueName: \"kubernetes.io/projected/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-kube-api-access-54hjp\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.534264 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.847303 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" event={"ID":"29833925-b21b-44d4-954c-e3252e5e69c4","Type":"ContainerStarted","Data":"2ef9903d192bdc03acaf2fe74facecbf1f886cfe12a9446e1e544403dd9c2365"}
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.847348 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" event={"ID":"29833925-b21b-44d4-954c-e3252e5e69c4","Type":"ContainerStarted","Data":"7de1da900ab59863539f4e6656b31f82070f579ba91efc05ea10d5e6c1d0ff39"}
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.849715 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539290-npvbb" event={"ID":"d3644e57-7093-4402-a6f2-48ed10ac14fa","Type":"ContainerStarted","Data":"4f151dd273dafbf61f585fc22ccdced51fd774a77ab507a61fa8f9d6f851d433"}
Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.868217 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" podStartSLOduration=1.8682008300000001 podStartE2EDuration="1.86820083s" podCreationTimestamp="2026-03-01 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:01.863409425 +0000 UTC m=+1331.105288632" watchObservedRunningTime="2026-03-01 09:30:01.86820083 +0000 UTC m=+1331.110080027"
Mar 01 09:30:02 crc kubenswrapper[4792]: I0301 09:30:02.003326 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:30:02 crc kubenswrapper[4792]: W0301 09:30:02.017447 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b9dfa7c_35ce_4f0d_9439_ed55e060a486.slice/crio-13d1178136245bb5d7f800642b1b34582d1e453c7a1057abd9a450d34e19ac11 WatchSource:0}: Error finding container 13d1178136245bb5d7f800642b1b34582d1e453c7a1057abd9a450d34e19ac11: Status 404 returned error can't find the container with id 13d1178136245bb5d7f800642b1b34582d1e453c7a1057abd9a450d34e19ac11
Mar 01 09:30:02 crc kubenswrapper[4792]: I0301 09:30:02.859939 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerStarted","Data":"98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321"}
Mar 01 09:30:02 crc kubenswrapper[4792]: I0301 09:30:02.860238 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerStarted","Data":"13d1178136245bb5d7f800642b1b34582d1e453c7a1057abd9a450d34e19ac11"}
Mar 01 09:30:02 crc kubenswrapper[4792]: I0301 09:30:02.861516 4792 generic.go:334] "Generic (PLEG): container finished" podID="29833925-b21b-44d4-954c-e3252e5e69c4" containerID="2ef9903d192bdc03acaf2fe74facecbf1f886cfe12a9446e1e544403dd9c2365" exitCode=0
Mar 01 09:30:02 crc kubenswrapper[4792]: I0301 09:30:02.861550 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" event={"ID":"29833925-b21b-44d4-954c-e3252e5e69c4","Type":"ContainerDied","Data":"2ef9903d192bdc03acaf2fe74facecbf1f886cfe12a9446e1e544403dd9c2365"}
Mar 01 09:30:03 crc kubenswrapper[4792]: I0301 09:30:03.882098 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerStarted","Data":"26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f"}
Mar 01 09:30:03 crc kubenswrapper[4792]: I0301 09:30:03.882407 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerStarted","Data":"b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6"}
Mar 01 09:30:03 crc kubenswrapper[4792]: I0301 09:30:03.886287 4792 generic.go:334] "Generic (PLEG): container finished" podID="de6ead5c-face-41ff-ab6e-aebb7ca73c1c" containerID="b0e1e09f850992f4661d6be2a0a76260a157f0e5c0875fd88ff0bc92644a8d13" exitCode=0
Mar 01 09:30:03 crc kubenswrapper[4792]: I0301 09:30:03.887292 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" event={"ID":"de6ead5c-face-41ff-ab6e-aebb7ca73c1c","Type":"ContainerDied","Data":"b0e1e09f850992f4661d6be2a0a76260a157f0e5c0875fd88ff0bc92644a8d13"}
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.196269 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.261718 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29833925-b21b-44d4-954c-e3252e5e69c4-secret-volume\") pod \"29833925-b21b-44d4-954c-e3252e5e69c4\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") "
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.261850 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29833925-b21b-44d4-954c-e3252e5e69c4-config-volume\") pod \"29833925-b21b-44d4-954c-e3252e5e69c4\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") "
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.261962 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjrfl\" (UniqueName: \"kubernetes.io/projected/29833925-b21b-44d4-954c-e3252e5e69c4-kube-api-access-mjrfl\") pod \"29833925-b21b-44d4-954c-e3252e5e69c4\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") "
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.262618 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29833925-b21b-44d4-954c-e3252e5e69c4-config-volume" (OuterVolumeSpecName: "config-volume") pod "29833925-b21b-44d4-954c-e3252e5e69c4" (UID: "29833925-b21b-44d4-954c-e3252e5e69c4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.267024 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29833925-b21b-44d4-954c-e3252e5e69c4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "29833925-b21b-44d4-954c-e3252e5e69c4" (UID: "29833925-b21b-44d4-954c-e3252e5e69c4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.269010 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29833925-b21b-44d4-954c-e3252e5e69c4-kube-api-access-mjrfl" (OuterVolumeSpecName: "kube-api-access-mjrfl") pod "29833925-b21b-44d4-954c-e3252e5e69c4" (UID: "29833925-b21b-44d4-954c-e3252e5e69c4"). InnerVolumeSpecName "kube-api-access-mjrfl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.365140 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjrfl\" (UniqueName: \"kubernetes.io/projected/29833925-b21b-44d4-954c-e3252e5e69c4-kube-api-access-mjrfl\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.365532 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29833925-b21b-44d4-954c-e3252e5e69c4-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.365593 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29833925-b21b-44d4-954c-e3252e5e69c4-config-volume\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.895745 4792 generic.go:334] "Generic (PLEG): container finished" podID="d3644e57-7093-4402-a6f2-48ed10ac14fa" containerID="7926ce126d7f3dd092ea29933967e6329a351e44fde88116cf9663b118841513" exitCode=0
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.895808 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539290-npvbb" event={"ID":"d3644e57-7093-4402-a6f2-48ed10ac14fa","Type":"ContainerDied","Data":"7926ce126d7f3dd092ea29933967e6329a351e44fde88116cf9663b118841513"}
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.898137 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.898205 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" event={"ID":"29833925-b21b-44d4-954c-e3252e5e69c4","Type":"ContainerDied","Data":"7de1da900ab59863539f4e6656b31f82070f579ba91efc05ea10d5e6c1d0ff39"}
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.898246 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de1da900ab59863539f4e6656b31f82070f579ba91efc05ea10d5e6c1d0ff39"
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.943377 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.943436 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.198623 4792 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.285293 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4jjf\" (UniqueName: \"kubernetes.io/projected/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-kube-api-access-j4jjf\") pod \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.285385 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-scripts\") pod \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.285437 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-combined-ca-bundle\") pod \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.285513 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-config-data\") pod \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.291605 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-kube-api-access-j4jjf" (OuterVolumeSpecName: "kube-api-access-j4jjf") pod "de6ead5c-face-41ff-ab6e-aebb7ca73c1c" (UID: "de6ead5c-face-41ff-ab6e-aebb7ca73c1c"). InnerVolumeSpecName "kube-api-access-j4jjf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.307153 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-scripts" (OuterVolumeSpecName: "scripts") pod "de6ead5c-face-41ff-ab6e-aebb7ca73c1c" (UID: "de6ead5c-face-41ff-ab6e-aebb7ca73c1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.312714 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-config-data" (OuterVolumeSpecName: "config-data") pod "de6ead5c-face-41ff-ab6e-aebb7ca73c1c" (UID: "de6ead5c-face-41ff-ab6e-aebb7ca73c1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.316936 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de6ead5c-face-41ff-ab6e-aebb7ca73c1c" (UID: "de6ead5c-face-41ff-ab6e-aebb7ca73c1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.387781 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4jjf\" (UniqueName: \"kubernetes.io/projected/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-kube-api-access-j4jjf\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.387816 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.387826 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.387840 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.907502 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" event={"ID":"de6ead5c-face-41ff-ab6e-aebb7ca73c1c","Type":"ContainerDied","Data":"69bc546e1a44d03d4232782332f19345cbd8f578e040e6ba8b8c80c9abe934e9"} Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.907539 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69bc546e1a44d03d4232782332f19345cbd8f578e040e6ba8b8c80c9abe934e9" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.907587 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.913061 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerStarted","Data":"ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463"} Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.913105 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.946581 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.413804943 podStartE2EDuration="4.946561921s" podCreationTimestamp="2026-03-01 09:30:01 +0000 UTC" firstStartedPulling="2026-03-01 09:30:02.019749034 +0000 UTC m=+1331.261628231" lastFinishedPulling="2026-03-01 09:30:05.552506012 +0000 UTC m=+1334.794385209" observedRunningTime="2026-03-01 09:30:05.939482401 +0000 UTC m=+1335.181361598" watchObservedRunningTime="2026-03-01 09:30:05.946561921 +0000 UTC m=+1335.188441118" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.006018 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 01 09:30:06 crc kubenswrapper[4792]: E0301 09:30:06.006476 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6ead5c-face-41ff-ab6e-aebb7ca73c1c" containerName="nova-cell0-conductor-db-sync" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.006510 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6ead5c-face-41ff-ab6e-aebb7ca73c1c" containerName="nova-cell0-conductor-db-sync" Mar 01 09:30:06 crc kubenswrapper[4792]: E0301 09:30:06.006525 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29833925-b21b-44d4-954c-e3252e5e69c4" containerName="collect-profiles" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.006533 4792 
state_mem.go:107] "Deleted CPUSet assignment" podUID="29833925-b21b-44d4-954c-e3252e5e69c4" containerName="collect-profiles" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.006724 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="29833925-b21b-44d4-954c-e3252e5e69c4" containerName="collect-profiles" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.006761 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6ead5c-face-41ff-ab6e-aebb7ca73c1c" containerName="nova-cell0-conductor-db-sync" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.008261 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.011165 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hfrkw" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.022146 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.035807 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.098885 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.099099 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tffl\" (UniqueName: \"kubernetes.io/projected/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-kube-api-access-9tffl\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " 
pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.099175 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.200773 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tffl\" (UniqueName: \"kubernetes.io/projected/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-kube-api-access-9tffl\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.200851 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.201047 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.207185 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 
09:30:06.207682 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.218054 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tffl\" (UniqueName: \"kubernetes.io/projected/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-kube-api-access-9tffl\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.304672 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539290-npvbb" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.325592 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.403960 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxxtb\" (UniqueName: \"kubernetes.io/projected/d3644e57-7093-4402-a6f2-48ed10ac14fa-kube-api-access-nxxtb\") pod \"d3644e57-7093-4402-a6f2-48ed10ac14fa\" (UID: \"d3644e57-7093-4402-a6f2-48ed10ac14fa\") " Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.419164 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3644e57-7093-4402-a6f2-48ed10ac14fa-kube-api-access-nxxtb" (OuterVolumeSpecName: "kube-api-access-nxxtb") pod "d3644e57-7093-4402-a6f2-48ed10ac14fa" (UID: "d3644e57-7093-4402-a6f2-48ed10ac14fa"). InnerVolumeSpecName "kube-api-access-nxxtb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.507100 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxxtb\" (UniqueName: \"kubernetes.io/projected/d3644e57-7093-4402-a6f2-48ed10ac14fa-kube-api-access-nxxtb\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.745894 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 01 09:30:06 crc kubenswrapper[4792]: W0301 09:30:06.748034 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf95aafcd_79b6_4ece_b3e1_ee9ea32a2754.slice/crio-4eadc073525a6b5d36de1b41c86d60c694e2676d2f2e09432b3268b0c047e386 WatchSource:0}: Error finding container 4eadc073525a6b5d36de1b41c86d60c694e2676d2f2e09432b3268b0c047e386: Status 404 returned error can't find the container with id 4eadc073525a6b5d36de1b41c86d60c694e2676d2f2e09432b3268b0c047e386 Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.922109 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539290-npvbb" event={"ID":"d3644e57-7093-4402-a6f2-48ed10ac14fa","Type":"ContainerDied","Data":"4f151dd273dafbf61f585fc22ccdced51fd774a77ab507a61fa8f9d6f851d433"} Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.923446 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f151dd273dafbf61f585fc22ccdced51fd774a77ab507a61fa8f9d6f851d433" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.923532 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539290-npvbb" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.926512 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754","Type":"ContainerStarted","Data":"4eadc073525a6b5d36de1b41c86d60c694e2676d2f2e09432b3268b0c047e386"} Mar 01 09:30:07 crc kubenswrapper[4792]: I0301 09:30:07.386019 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539284-g5rbc"] Mar 01 09:30:07 crc kubenswrapper[4792]: I0301 09:30:07.394644 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539284-g5rbc"] Mar 01 09:30:07 crc kubenswrapper[4792]: I0301 09:30:07.418714 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1263f40a-23c7-4ab8-8ebc-7c697e2eacd6" path="/var/lib/kubelet/pods/1263f40a-23c7-4ab8-8ebc-7c697e2eacd6/volumes" Mar 01 09:30:07 crc kubenswrapper[4792]: I0301 09:30:07.934130 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754","Type":"ContainerStarted","Data":"7a5248b939415c8e8f78362b848bacdcd5e78b6f41deb4adf883e48ced4035df"} Mar 01 09:30:07 crc kubenswrapper[4792]: I0301 09:30:07.934280 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:07 crc kubenswrapper[4792]: I0301 09:30:07.953217 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.9531967569999997 podStartE2EDuration="2.953196757s" podCreationTimestamp="2026-03-01 09:30:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:07.94623533 +0000 UTC m=+1337.188114537" watchObservedRunningTime="2026-03-01 
09:30:07.953196757 +0000 UTC m=+1337.195075954" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.354439 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.792500 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-8vfwt"] Mar 01 09:30:11 crc kubenswrapper[4792]: E0301 09:30:11.793131 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3644e57-7093-4402-a6f2-48ed10ac14fa" containerName="oc" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.793147 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3644e57-7093-4402-a6f2-48ed10ac14fa" containerName="oc" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.793331 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3644e57-7093-4402-a6f2-48ed10ac14fa" containerName="oc" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.793840 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.797553 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.797829 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.814703 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8vfwt"] Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.900672 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.900744 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l625\" (UniqueName: \"kubernetes.io/projected/32a84376-7418-49cd-9c62-fdd1af7ec31b-kube-api-access-8l625\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.900780 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-scripts\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.900811 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-config-data\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.935702 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.947001 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.949484 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.969600 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.001966 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwtn2\" (UniqueName: \"kubernetes.io/projected/892ffeb5-f853-45a6-9ad4-a2a40437a406-kube-api-access-gwtn2\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.002020 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.002068 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " 
pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.002100 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/892ffeb5-f853-45a6-9ad4-a2a40437a406-logs\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.002131 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l625\" (UniqueName: \"kubernetes.io/projected/32a84376-7418-49cd-9c62-fdd1af7ec31b-kube-api-access-8l625\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.002161 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-scripts\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.003984 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-config-data\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.004029 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-config-data\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 
09:30:12.022285 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.031147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-scripts\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.033007 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-config-data\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.045567 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.045746 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l625\" (UniqueName: \"kubernetes.io/projected/32a84376-7418-49cd-9c62-fdd1af7ec31b-kube-api-access-8l625\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.048354 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.058316 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.074459 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.109836 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-config-data\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.109927 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvskn\" (UniqueName: \"kubernetes.io/projected/49690073-1340-4686-bc4b-f69901bb45d9-kube-api-access-dvskn\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.109970 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/892ffeb5-f853-45a6-9ad4-a2a40437a406-logs\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.110000 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49690073-1340-4686-bc4b-f69901bb45d9-logs\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.110072 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-config-data\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.110097 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.110163 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwtn2\" (UniqueName: \"kubernetes.io/projected/892ffeb5-f853-45a6-9ad4-a2a40437a406-kube-api-access-gwtn2\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.110189 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.111128 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/892ffeb5-f853-45a6-9ad4-a2a40437a406-logs\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.116111 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.160747 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.162026 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-config-data\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.213361 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.213706 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-config-data\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.213761 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvskn\" (UniqueName: \"kubernetes.io/projected/49690073-1340-4686-bc4b-f69901bb45d9-kube-api-access-dvskn\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.213796 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/49690073-1340-4686-bc4b-f69901bb45d9-logs\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.214502 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49690073-1340-4686-bc4b-f69901bb45d9-logs\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.229221 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwtn2\" (UniqueName: \"kubernetes.io/projected/892ffeb5-f853-45a6-9ad4-a2a40437a406-kube-api-access-gwtn2\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.230629 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.235766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-config-data\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.266639 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.267391 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvskn\" (UniqueName: \"kubernetes.io/projected/49690073-1340-4686-bc4b-f69901bb45d9-kube-api-access-dvskn\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.285315 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.286515 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.302529 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.311715 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.348788 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.380257 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.382898 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.404041 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.420091 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.421397 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.423789 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424651 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-dns-svc\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424682 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-config\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424701 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424725 
4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-config-data\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424748 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwdw9\" (UniqueName: \"kubernetes.io/projected/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-kube-api-access-pwdw9\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424780 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424812 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s2hb\" (UniqueName: \"kubernetes.io/projected/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-kube-api-access-5s2hb\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424850 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424934 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424969 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhfgl\" (UniqueName: \"kubernetes.io/projected/dae0c901-5f9c-4248-96dd-08acb2b5d278-kube-api-access-rhfgl\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424992 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.445425 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.526761 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.526846 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2hb\" (UniqueName: \"kubernetes.io/projected/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-kube-api-access-5s2hb\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.526929 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527016 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527051 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhfgl\" (UniqueName: \"kubernetes.io/projected/dae0c901-5f9c-4248-96dd-08acb2b5d278-kube-api-access-rhfgl\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527100 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527187 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-dns-svc\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " 
pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527241 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-config\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527265 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527295 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-config-data\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527350 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwdw9\" (UniqueName: \"kubernetes.io/projected/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-kube-api-access-pwdw9\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.528250 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.529015 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.533646 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-config\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.529852 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-dns-svc\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.552742 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhfgl\" (UniqueName: \"kubernetes.io/projected/dae0c901-5f9c-4248-96dd-08acb2b5d278-kube-api-access-rhfgl\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.556091 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s2hb\" (UniqueName: \"kubernetes.io/projected/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-kube-api-access-5s2hb\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.561131 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.561380 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.563509 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-config-data\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.563880 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwdw9\" (UniqueName: \"kubernetes.io/projected/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-kube-api-access-pwdw9\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.573097 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.653448 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.713589 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.749569 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.754700 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8vfwt"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.953174 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.015164 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49690073-1340-4686-bc4b-f69901bb45d9","Type":"ContainerStarted","Data":"527b5d3ab3c0911a39430a25a4440c1f01d3f9995da50cf30f22affc0352f613"} Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.018019 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8vfwt" event={"ID":"32a84376-7418-49cd-9c62-fdd1af7ec31b","Type":"ContainerStarted","Data":"3c68bb44e29a7b14a923b915dc00892bf89acf28bc6734efebeb0e2ee1c07a61"} Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.108666 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.224578 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tjd85"] Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.226392 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.233893 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.239938 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.249554 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-config-data\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.249601 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5qnd\" (UniqueName: \"kubernetes.io/projected/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-kube-api-access-l5qnd\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.249722 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-scripts\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.249765 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tjd85\" 
(UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: W0301 09:30:13.262378 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8015a6e_cf5d_4728_b2e6_66bb8960fd40.slice/crio-8d6bd74bfd2c811ba8feb074f6cd879de61baffea6857711abd8ac6646b3af71 WatchSource:0}: Error finding container 8d6bd74bfd2c811ba8feb074f6cd879de61baffea6857711abd8ac6646b3af71: Status 404 returned error can't find the container with id 8d6bd74bfd2c811ba8feb074f6cd879de61baffea6857711abd8ac6646b3af71 Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.262921 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tjd85"] Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.277505 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.352939 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-scripts\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.353004 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.353049 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-config-data\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.353085 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5qnd\" (UniqueName: \"kubernetes.io/projected/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-kube-api-access-l5qnd\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.377117 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-scripts\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.377710 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.380383 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5qnd\" (UniqueName: \"kubernetes.io/projected/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-kube-api-access-l5qnd\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.386799 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-config-data\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: W0301 09:30:13.425873 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddae0c901_5f9c_4248_96dd_08acb2b5d278.slice/crio-c9bb9d1c4be64e9583f3d8ffde433440cf4e39ab72b4447ef9f1e98ba96850ca WatchSource:0}: Error finding container c9bb9d1c4be64e9583f3d8ffde433440cf4e39ab72b4447ef9f1e98ba96850ca: Status 404 returned error can't find the container with id c9bb9d1c4be64e9583f3d8ffde433440cf4e39ab72b4447ef9f1e98ba96850ca Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.436899 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw"] Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.566954 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.600730 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.064365 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8vfwt" event={"ID":"32a84376-7418-49cd-9c62-fdd1af7ec31b","Type":"ContainerStarted","Data":"cc49a2b9acb35bd4588c6c9cb6d10085e66d67e1476ad98c515c87fcc8a40be2"} Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.069032 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2b24b4c8-4f85-4eae-93c7-3249c1a54f09","Type":"ContainerStarted","Data":"666f673dbf785249e4b855230a831650c499d51cb9413297a868fb5bc1afca52"} Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.077542 4792 generic.go:334] "Generic (PLEG): container finished" podID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerID="4f80ddb9167cc5bebd3ccdc43bf19f478b728967aa30e43898dc24927a2246f9" exitCode=0 Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.077621 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" event={"ID":"dae0c901-5f9c-4248-96dd-08acb2b5d278","Type":"ContainerDied","Data":"4f80ddb9167cc5bebd3ccdc43bf19f478b728967aa30e43898dc24927a2246f9"} Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.077652 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" event={"ID":"dae0c901-5f9c-4248-96dd-08acb2b5d278","Type":"ContainerStarted","Data":"c9bb9d1c4be64e9583f3d8ffde433440cf4e39ab72b4447ef9f1e98ba96850ca"} Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.082508 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"892ffeb5-f853-45a6-9ad4-a2a40437a406","Type":"ContainerStarted","Data":"e1ddfe00df757587e10aacff0143490ad9255e7b1f73da1f009a37752ae1c648"} Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.084303 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-scheduler-0" event={"ID":"b8015a6e-cf5d-4728-b2e6-66bb8960fd40","Type":"ContainerStarted","Data":"8d6bd74bfd2c811ba8feb074f6cd879de61baffea6857711abd8ac6646b3af71"} Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.097255 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-8vfwt" podStartSLOduration=3.097238198 podStartE2EDuration="3.097238198s" podCreationTimestamp="2026-03-01 09:30:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:14.090350443 +0000 UTC m=+1343.332229640" watchObservedRunningTime="2026-03-01 09:30:14.097238198 +0000 UTC m=+1343.339117395" Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.172030 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tjd85"] Mar 01 09:30:14 crc kubenswrapper[4792]: W0301 09:30:14.242979 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7269b8b7_440f_4fae_b0f1_f624e9d5b29a.slice/crio-c0f91939f7bb3b7f30d5eba3a1ceca8589d9bda852653f48b1879bd42cead480 WatchSource:0}: Error finding container c0f91939f7bb3b7f30d5eba3a1ceca8589d9bda852653f48b1879bd42cead480: Status 404 returned error can't find the container with id c0f91939f7bb3b7f30d5eba3a1ceca8589d9bda852653f48b1879bd42cead480 Mar 01 09:30:15 crc kubenswrapper[4792]: I0301 09:30:15.102569 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" event={"ID":"dae0c901-5f9c-4248-96dd-08acb2b5d278","Type":"ContainerStarted","Data":"36ef0d467aa78ea6675158c30915ca7e3bff42a879c627274dd90d0498f741fd"} Mar 01 09:30:15 crc kubenswrapper[4792]: I0301 09:30:15.103306 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:15 crc 
kubenswrapper[4792]: I0301 09:30:15.107175 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tjd85" event={"ID":"7269b8b7-440f-4fae-b0f1-f624e9d5b29a","Type":"ContainerStarted","Data":"7b120b9d05aec1bbdb715a0cec1430208c27b75e09d6308e245c67d773de0e22"} Mar 01 09:30:15 crc kubenswrapper[4792]: I0301 09:30:15.107201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tjd85" event={"ID":"7269b8b7-440f-4fae-b0f1-f624e9d5b29a","Type":"ContainerStarted","Data":"c0f91939f7bb3b7f30d5eba3a1ceca8589d9bda852653f48b1879bd42cead480"} Mar 01 09:30:15 crc kubenswrapper[4792]: I0301 09:30:15.127943 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" podStartSLOduration=3.127920062 podStartE2EDuration="3.127920062s" podCreationTimestamp="2026-03-01 09:30:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:15.122219465 +0000 UTC m=+1344.364098662" watchObservedRunningTime="2026-03-01 09:30:15.127920062 +0000 UTC m=+1344.369799269" Mar 01 09:30:15 crc kubenswrapper[4792]: I0301 09:30:15.146536 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-tjd85" podStartSLOduration=2.146518858 podStartE2EDuration="2.146518858s" podCreationTimestamp="2026-03-01 09:30:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:15.13952051 +0000 UTC m=+1344.381399707" watchObservedRunningTime="2026-03-01 09:30:15.146518858 +0000 UTC m=+1344.388398055" Mar 01 09:30:16 crc kubenswrapper[4792]: I0301 09:30:16.067376 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:16 crc kubenswrapper[4792]: I0301 09:30:16.100581 4792 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.134370 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49690073-1340-4686-bc4b-f69901bb45d9","Type":"ContainerStarted","Data":"e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7"} Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.134862 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49690073-1340-4686-bc4b-f69901bb45d9","Type":"ContainerStarted","Data":"0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8"} Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.135726 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b8015a6e-cf5d-4728-b2e6-66bb8960fd40","Type":"ContainerStarted","Data":"43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03"} Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.137813 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2b24b4c8-4f85-4eae-93c7-3249c1a54f09","Type":"ContainerStarted","Data":"b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5"} Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.137836 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="2b24b4c8-4f85-4eae-93c7-3249c1a54f09" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5" gracePeriod=30 Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.141126 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"892ffeb5-f853-45a6-9ad4-a2a40437a406","Type":"ContainerStarted","Data":"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa"} Mar 01 09:30:18 crc 
kubenswrapper[4792]: I0301 09:30:18.141157 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"892ffeb5-f853-45a6-9ad4-a2a40437a406","Type":"ContainerStarted","Data":"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3"} Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.141273 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-log" containerID="cri-o://93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3" gracePeriod=30 Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.141306 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-metadata" containerID="cri-o://0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa" gracePeriod=30 Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.163529 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.119655806 podStartE2EDuration="7.163510509s" podCreationTimestamp="2026-03-01 09:30:11 +0000 UTC" firstStartedPulling="2026-03-01 09:30:12.979806444 +0000 UTC m=+1342.221685641" lastFinishedPulling="2026-03-01 09:30:17.023661147 +0000 UTC m=+1346.265540344" observedRunningTime="2026-03-01 09:30:18.158322235 +0000 UTC m=+1347.400201432" watchObservedRunningTime="2026-03-01 09:30:18.163510509 +0000 UTC m=+1347.405389706" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.178681 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.418364087 podStartE2EDuration="6.178663202s" podCreationTimestamp="2026-03-01 09:30:12 +0000 UTC" firstStartedPulling="2026-03-01 09:30:13.264747036 +0000 UTC m=+1342.506626233" lastFinishedPulling="2026-03-01 
09:30:17.025046141 +0000 UTC m=+1346.266925348" observedRunningTime="2026-03-01 09:30:18.173885248 +0000 UTC m=+1347.415764445" watchObservedRunningTime="2026-03-01 09:30:18.178663202 +0000 UTC m=+1347.420542399" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.201879 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.753811261 podStartE2EDuration="6.201851828s" podCreationTimestamp="2026-03-01 09:30:12 +0000 UTC" firstStartedPulling="2026-03-01 09:30:13.582134507 +0000 UTC m=+1342.824013694" lastFinishedPulling="2026-03-01 09:30:17.030175064 +0000 UTC m=+1346.272054261" observedRunningTime="2026-03-01 09:30:18.18774829 +0000 UTC m=+1347.429627487" watchObservedRunningTime="2026-03-01 09:30:18.201851828 +0000 UTC m=+1347.443731025" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.217131 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.312282314 podStartE2EDuration="7.217111884s" podCreationTimestamp="2026-03-01 09:30:11 +0000 UTC" firstStartedPulling="2026-03-01 09:30:13.123394437 +0000 UTC m=+1342.365273634" lastFinishedPulling="2026-03-01 09:30:17.028224007 +0000 UTC m=+1346.270103204" observedRunningTime="2026-03-01 09:30:18.208450337 +0000 UTC m=+1347.450329534" watchObservedRunningTime="2026-03-01 09:30:18.217111884 +0000 UTC m=+1347.458991081" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.768191 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.946969 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwtn2\" (UniqueName: \"kubernetes.io/projected/892ffeb5-f853-45a6-9ad4-a2a40437a406-kube-api-access-gwtn2\") pod \"892ffeb5-f853-45a6-9ad4-a2a40437a406\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.947283 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-config-data\") pod \"892ffeb5-f853-45a6-9ad4-a2a40437a406\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.947377 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/892ffeb5-f853-45a6-9ad4-a2a40437a406-logs\") pod \"892ffeb5-f853-45a6-9ad4-a2a40437a406\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.947444 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-combined-ca-bundle\") pod \"892ffeb5-f853-45a6-9ad4-a2a40437a406\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.947716 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/892ffeb5-f853-45a6-9ad4-a2a40437a406-logs" (OuterVolumeSpecName: "logs") pod "892ffeb5-f853-45a6-9ad4-a2a40437a406" (UID: "892ffeb5-f853-45a6-9ad4-a2a40437a406"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.948156 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/892ffeb5-f853-45a6-9ad4-a2a40437a406-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.954544 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/892ffeb5-f853-45a6-9ad4-a2a40437a406-kube-api-access-gwtn2" (OuterVolumeSpecName: "kube-api-access-gwtn2") pod "892ffeb5-f853-45a6-9ad4-a2a40437a406" (UID: "892ffeb5-f853-45a6-9ad4-a2a40437a406"). InnerVolumeSpecName "kube-api-access-gwtn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.976300 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "892ffeb5-f853-45a6-9ad4-a2a40437a406" (UID: "892ffeb5-f853-45a6-9ad4-a2a40437a406"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.979418 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-config-data" (OuterVolumeSpecName: "config-data") pod "892ffeb5-f853-45a6-9ad4-a2a40437a406" (UID: "892ffeb5-f853-45a6-9ad4-a2a40437a406"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.050110 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.050156 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwtn2\" (UniqueName: \"kubernetes.io/projected/892ffeb5-f853-45a6-9ad4-a2a40437a406-kube-api-access-gwtn2\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.050172 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.152514 4792 generic.go:334] "Generic (PLEG): container finished" podID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerID="0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa" exitCode=0 Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.152814 4792 generic.go:334] "Generic (PLEG): container finished" podID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerID="93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3" exitCode=143 Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.152696 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"892ffeb5-f853-45a6-9ad4-a2a40437a406","Type":"ContainerDied","Data":"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa"} Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.152975 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"892ffeb5-f853-45a6-9ad4-a2a40437a406","Type":"ContainerDied","Data":"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3"} Mar 01 09:30:19 crc 
kubenswrapper[4792]: I0301 09:30:19.152994 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"892ffeb5-f853-45a6-9ad4-a2a40437a406","Type":"ContainerDied","Data":"e1ddfe00df757587e10aacff0143490ad9255e7b1f73da1f009a37752ae1c648"} Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.153012 4792 scope.go:117] "RemoveContainer" containerID="0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.152668 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.185706 4792 scope.go:117] "RemoveContainer" containerID="93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.206208 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.225128 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.225548 4792 scope.go:117] "RemoveContainer" containerID="0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa" Mar 01 09:30:19 crc kubenswrapper[4792]: E0301 09:30:19.226165 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa\": container with ID starting with 0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa not found: ID does not exist" containerID="0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.226203 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa"} 
err="failed to get container status \"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa\": rpc error: code = NotFound desc = could not find container \"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa\": container with ID starting with 0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa not found: ID does not exist" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.226230 4792 scope.go:117] "RemoveContainer" containerID="93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3" Mar 01 09:30:19 crc kubenswrapper[4792]: E0301 09:30:19.226523 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3\": container with ID starting with 93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3 not found: ID does not exist" containerID="93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.226552 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3"} err="failed to get container status \"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3\": rpc error: code = NotFound desc = could not find container \"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3\": container with ID starting with 93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3 not found: ID does not exist" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.226570 4792 scope.go:117] "RemoveContainer" containerID="0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.226791 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa"} err="failed to get container status \"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa\": rpc error: code = NotFound desc = could not find container \"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa\": container with ID starting with 0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa not found: ID does not exist" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.226823 4792 scope.go:117] "RemoveContainer" containerID="93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.227355 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3"} err="failed to get container status \"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3\": rpc error: code = NotFound desc = could not find container \"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3\": container with ID starting with 93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3 not found: ID does not exist" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.239019 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:19 crc kubenswrapper[4792]: E0301 09:30:19.239427 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-log" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.239443 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-log" Mar 01 09:30:19 crc kubenswrapper[4792]: E0301 09:30:19.239458 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-metadata" Mar 
01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.239465 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-metadata" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.239640 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-metadata" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.239660 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-log" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.240519 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.249637 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.285745 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bshv\" (UniqueName: \"kubernetes.io/projected/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-kube-api-access-8bshv\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.286087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-logs\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.286144 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-combined-ca-bundle\") pod \"nova-metadata-0\" 
(UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.286223 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-config-data\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.286267 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.288946 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.289403 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.388329 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-logs\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.388412 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.388507 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-config-data\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.388553 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.388599 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bshv\" (UniqueName: \"kubernetes.io/projected/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-kube-api-access-8bshv\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.388834 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-logs\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.394178 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.402649 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-config-data\") pod \"nova-metadata-0\" 
(UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.409442 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.409527 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bshv\" (UniqueName: \"kubernetes.io/projected/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-kube-api-access-8bshv\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.420002 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" path="/var/lib/kubelet/pods/892ffeb5-f853-45a6-9ad4-a2a40437a406/volumes" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.604979 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:20 crc kubenswrapper[4792]: I0301 09:30:20.053427 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:20 crc kubenswrapper[4792]: W0301 09:30:20.061030 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0521e997_0d3e_4e56_9302_cbd2c79e2c0a.slice/crio-327e848ad4f5537716892bd8c75c4caf7104a2e6089d9a069750a5ed9373fae4 WatchSource:0}: Error finding container 327e848ad4f5537716892bd8c75c4caf7104a2e6089d9a069750a5ed9373fae4: Status 404 returned error can't find the container with id 327e848ad4f5537716892bd8c75c4caf7104a2e6089d9a069750a5ed9373fae4 Mar 01 09:30:20 crc kubenswrapper[4792]: I0301 09:30:20.180045 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0521e997-0d3e-4e56-9302-cbd2c79e2c0a","Type":"ContainerStarted","Data":"327e848ad4f5537716892bd8c75c4caf7104a2e6089d9a069750a5ed9373fae4"} Mar 01 09:30:21 crc kubenswrapper[4792]: I0301 09:30:21.190668 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0521e997-0d3e-4e56-9302-cbd2c79e2c0a","Type":"ContainerStarted","Data":"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff"} Mar 01 09:30:21 crc kubenswrapper[4792]: I0301 09:30:21.190940 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0521e997-0d3e-4e56-9302-cbd2c79e2c0a","Type":"ContainerStarted","Data":"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547"} Mar 01 09:30:21 crc kubenswrapper[4792]: I0301 09:30:21.218601 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.218579633 podStartE2EDuration="2.218579633s" podCreationTimestamp="2026-03-01 09:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:21.20676514 +0000 UTC m=+1350.448644347" watchObservedRunningTime="2026-03-01 09:30:21.218579633 +0000 UTC m=+1350.460458830" Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.205098 4792 generic.go:334] "Generic (PLEG): container finished" podID="32a84376-7418-49cd-9c62-fdd1af7ec31b" containerID="cc49a2b9acb35bd4588c6c9cb6d10085e66d67e1476ad98c515c87fcc8a40be2" exitCode=0 Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.205226 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8vfwt" event={"ID":"32a84376-7418-49cd-9c62-fdd1af7ec31b","Type":"ContainerDied","Data":"cc49a2b9acb35bd4588c6c9cb6d10085e66d67e1476ad98c515c87fcc8a40be2"} Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.314706 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.314955 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.654287 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.654326 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.694005 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.715161 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.749824 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:22 crc 
kubenswrapper[4792]: I0301 09:30:22.785922 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7675674687-rrbg6"] Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.786394 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7675674687-rrbg6" podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerName="dnsmasq-dns" containerID="cri-o://fa470bfaa3e77451b41cb706fe5366ce1e220ca16dcf36e41c6bb50c6ad8d869" gracePeriod=10 Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.972336 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7675674687-rrbg6" podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.155:5353: connect: connection refused" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.219264 4792 generic.go:334] "Generic (PLEG): container finished" podID="7269b8b7-440f-4fae-b0f1-f624e9d5b29a" containerID="7b120b9d05aec1bbdb715a0cec1430208c27b75e09d6308e245c67d773de0e22" exitCode=0 Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.219344 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tjd85" event={"ID":"7269b8b7-440f-4fae-b0f1-f624e9d5b29a","Type":"ContainerDied","Data":"7b120b9d05aec1bbdb715a0cec1430208c27b75e09d6308e245c67d773de0e22"} Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.237842 4792 generic.go:334] "Generic (PLEG): container finished" podID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerID="fa470bfaa3e77451b41cb706fe5366ce1e220ca16dcf36e41c6bb50c6ad8d869" exitCode=0 Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.238262 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7675674687-rrbg6" event={"ID":"b5f281f2-c77c-49cc-93a4-e7ed029f29bb","Type":"ContainerDied","Data":"fa470bfaa3e77451b41cb706fe5366ce1e220ca16dcf36e41c6bb50c6ad8d869"} Mar 01 09:30:23 crc 
kubenswrapper[4792]: I0301 09:30:23.293736 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.314000 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.404548 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.179:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.404577 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.179:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.490747 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j7zl\" (UniqueName: \"kubernetes.io/projected/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-kube-api-access-2j7zl\") pod \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.490844 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-sb\") pod \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.491031 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-dns-svc\") pod \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.491083 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-nb\") pod \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.491130 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-config\") pod \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.522736 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-kube-api-access-2j7zl" (OuterVolumeSpecName: "kube-api-access-2j7zl") pod "b5f281f2-c77c-49cc-93a4-e7ed029f29bb" (UID: "b5f281f2-c77c-49cc-93a4-e7ed029f29bb"). InnerVolumeSpecName "kube-api-access-2j7zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.569653 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b5f281f2-c77c-49cc-93a4-e7ed029f29bb" (UID: "b5f281f2-c77c-49cc-93a4-e7ed029f29bb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.581695 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b5f281f2-c77c-49cc-93a4-e7ed029f29bb" (UID: "b5f281f2-c77c-49cc-93a4-e7ed029f29bb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.593933 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j7zl\" (UniqueName: \"kubernetes.io/projected/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-kube-api-access-2j7zl\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.593962 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.593971 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.598119 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5f281f2-c77c-49cc-93a4-e7ed029f29bb" (UID: "b5f281f2-c77c-49cc-93a4-e7ed029f29bb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.613482 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-config" (OuterVolumeSpecName: "config") pod "b5f281f2-c77c-49cc-93a4-e7ed029f29bb" (UID: "b5f281f2-c77c-49cc-93a4-e7ed029f29bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.638222 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.697335 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.697372 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.798437 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-combined-ca-bundle\") pod \"32a84376-7418-49cd-9c62-fdd1af7ec31b\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.798496 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-scripts\") pod \"32a84376-7418-49cd-9c62-fdd1af7ec31b\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.798639 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-8l625\" (UniqueName: \"kubernetes.io/projected/32a84376-7418-49cd-9c62-fdd1af7ec31b-kube-api-access-8l625\") pod \"32a84376-7418-49cd-9c62-fdd1af7ec31b\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.798896 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-config-data\") pod \"32a84376-7418-49cd-9c62-fdd1af7ec31b\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.809538 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a84376-7418-49cd-9c62-fdd1af7ec31b-kube-api-access-8l625" (OuterVolumeSpecName: "kube-api-access-8l625") pod "32a84376-7418-49cd-9c62-fdd1af7ec31b" (UID: "32a84376-7418-49cd-9c62-fdd1af7ec31b"). InnerVolumeSpecName "kube-api-access-8l625". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.811111 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-scripts" (OuterVolumeSpecName: "scripts") pod "32a84376-7418-49cd-9c62-fdd1af7ec31b" (UID: "32a84376-7418-49cd-9c62-fdd1af7ec31b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.840059 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32a84376-7418-49cd-9c62-fdd1af7ec31b" (UID: "32a84376-7418-49cd-9c62-fdd1af7ec31b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.847676 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-config-data" (OuterVolumeSpecName: "config-data") pod "32a84376-7418-49cd-9c62-fdd1af7ec31b" (UID: "32a84376-7418-49cd-9c62-fdd1af7ec31b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.904988 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.905027 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.905040 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l625\" (UniqueName: \"kubernetes.io/projected/32a84376-7418-49cd-9c62-fdd1af7ec31b-kube-api-access-8l625\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.905053 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.255961 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8vfwt" event={"ID":"32a84376-7418-49cd-9c62-fdd1af7ec31b","Type":"ContainerDied","Data":"3c68bb44e29a7b14a923b915dc00892bf89acf28bc6734efebeb0e2ee1c07a61"} Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.256268 4792 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3c68bb44e29a7b14a923b915dc00892bf89acf28bc6734efebeb0e2ee1c07a61" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.256328 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.268656 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.269044 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7675674687-rrbg6" event={"ID":"b5f281f2-c77c-49cc-93a4-e7ed029f29bb","Type":"ContainerDied","Data":"23c50a76020a45988e337315d0efa1b099af136e2bf3deb4ff1bf47f7e64507f"} Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.269159 4792 scope.go:117] "RemoveContainer" containerID="fa470bfaa3e77451b41cb706fe5366ce1e220ca16dcf36e41c6bb50c6ad8d869" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.320594 4792 scope.go:117] "RemoveContainer" containerID="1bc8af094ffa7e776635e0ebadb742f3592eec13f050af88c7f45cc46dd6b7ae" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.338019 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7675674687-rrbg6"] Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.355249 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7675674687-rrbg6"] Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.389061 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.389508 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-log" containerID="cri-o://0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8" gracePeriod=30 Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 
09:30:24.390024 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-api" containerID="cri-o://e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7" gracePeriod=30 Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.403340 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.417199 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.417460 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-log" containerID="cri-o://08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547" gracePeriod=30 Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.417814 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-metadata" containerID="cri-o://f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff" gracePeriod=30 Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.605325 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.605366 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.788409 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.931758 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5qnd\" (UniqueName: \"kubernetes.io/projected/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-kube-api-access-l5qnd\") pod \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.931807 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-config-data\") pod \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.932016 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-combined-ca-bundle\") pod \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.932052 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-scripts\") pod \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.945030 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-scripts" (OuterVolumeSpecName: "scripts") pod "7269b8b7-440f-4fae-b0f1-f624e9d5b29a" (UID: "7269b8b7-440f-4fae-b0f1-f624e9d5b29a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.961625 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-kube-api-access-l5qnd" (OuterVolumeSpecName: "kube-api-access-l5qnd") pod "7269b8b7-440f-4fae-b0f1-f624e9d5b29a" (UID: "7269b8b7-440f-4fae-b0f1-f624e9d5b29a"). InnerVolumeSpecName "kube-api-access-l5qnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.987384 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7269b8b7-440f-4fae-b0f1-f624e9d5b29a" (UID: "7269b8b7-440f-4fae-b0f1-f624e9d5b29a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.002579 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-config-data" (OuterVolumeSpecName: "config-data") pod "7269b8b7-440f-4fae-b0f1-f624e9d5b29a" (UID: "7269b8b7-440f-4fae-b0f1-f624e9d5b29a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.048209 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5qnd\" (UniqueName: \"kubernetes.io/projected/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-kube-api-access-l5qnd\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.048248 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.048283 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.048293 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.147170 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.250534 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-config-data\") pod \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.250731 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-logs\") pod \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.250782 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-combined-ca-bundle\") pod \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.250868 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bshv\" (UniqueName: \"kubernetes.io/projected/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-kube-api-access-8bshv\") pod \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.250952 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-nova-metadata-tls-certs\") pod \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.251140 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-logs" (OuterVolumeSpecName: "logs") pod "0521e997-0d3e-4e56-9302-cbd2c79e2c0a" (UID: "0521e997-0d3e-4e56-9302-cbd2c79e2c0a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.251607 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.258047 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-kube-api-access-8bshv" (OuterVolumeSpecName: "kube-api-access-8bshv") pod "0521e997-0d3e-4e56-9302-cbd2c79e2c0a" (UID: "0521e997-0d3e-4e56-9302-cbd2c79e2c0a"). InnerVolumeSpecName "kube-api-access-8bshv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.283991 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0521e997-0d3e-4e56-9302-cbd2c79e2c0a" (UID: "0521e997-0d3e-4e56-9302-cbd2c79e2c0a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.290578 4792 generic.go:334] "Generic (PLEG): container finished" podID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerID="f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff" exitCode=0 Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.290638 4792 generic.go:334] "Generic (PLEG): container finished" podID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerID="08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547" exitCode=143 Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.290715 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0521e997-0d3e-4e56-9302-cbd2c79e2c0a","Type":"ContainerDied","Data":"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff"} Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.290750 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0521e997-0d3e-4e56-9302-cbd2c79e2c0a","Type":"ContainerDied","Data":"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547"} Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.290790 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0521e997-0d3e-4e56-9302-cbd2c79e2c0a","Type":"ContainerDied","Data":"327e848ad4f5537716892bd8c75c4caf7104a2e6089d9a069750a5ed9373fae4"} Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.290808 4792 scope.go:117] "RemoveContainer" containerID="f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.290986 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.302182 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-config-data" (OuterVolumeSpecName: "config-data") pod "0521e997-0d3e-4e56-9302-cbd2c79e2c0a" (UID: "0521e997-0d3e-4e56-9302-cbd2c79e2c0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.303361 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tjd85" event={"ID":"7269b8b7-440f-4fae-b0f1-f624e9d5b29a","Type":"ContainerDied","Data":"c0f91939f7bb3b7f30d5eba3a1ceca8589d9bda852653f48b1879bd42cead480"} Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.303395 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0f91939f7bb3b7f30d5eba3a1ceca8589d9bda852653f48b1879bd42cead480" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.303466 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.312797 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313155 4792 generic.go:334] "Generic (PLEG): container finished" podID="49690073-1340-4686-bc4b-f69901bb45d9" containerID="0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8" exitCode=143 Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.313288 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-metadata" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313310 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-metadata" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313319 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b8015a6e-cf5d-4728-b2e6-66bb8960fd40" containerName="nova-scheduler-scheduler" containerID="cri-o://43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03" gracePeriod=30 Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.313329 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerName="dnsmasq-dns" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313339 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerName="dnsmasq-dns" Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.313358 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerName="init" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313367 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerName="init" Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.313379 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7269b8b7-440f-4fae-b0f1-f624e9d5b29a" containerName="nova-cell1-conductor-db-sync" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313387 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7269b8b7-440f-4fae-b0f1-f624e9d5b29a" containerName="nova-cell1-conductor-db-sync" Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.313402 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-log" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313411 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-log" Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.313436 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a84376-7418-49cd-9c62-fdd1af7ec31b" containerName="nova-manage" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313445 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a84376-7418-49cd-9c62-fdd1af7ec31b" containerName="nova-manage" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313661 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a84376-7418-49cd-9c62-fdd1af7ec31b" containerName="nova-manage" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313676 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7269b8b7-440f-4fae-b0f1-f624e9d5b29a" containerName="nova-cell1-conductor-db-sync" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313687 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerName="dnsmasq-dns" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313705 4792 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-log" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313724 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-metadata" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.316810 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49690073-1340-4686-bc4b-f69901bb45d9","Type":"ContainerDied","Data":"0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8"} Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.316963 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.320205 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.338607 4792 scope.go:117] "RemoveContainer" containerID="08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.374155 4792 scope.go:117] "RemoveContainer" containerID="f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.375456 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.376605 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef6cc4e-2fd6-403b-a163-638395c30672-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.376649 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef6cc4e-2fd6-403b-a163-638395c30672-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.376704 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4rfg\" (UniqueName: \"kubernetes.io/projected/9ef6cc4e-2fd6-403b-a163-638395c30672-kube-api-access-z4rfg\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.376795 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.376807 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bshv\" (UniqueName: \"kubernetes.io/projected/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-kube-api-access-8bshv\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.376817 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.377787 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff\": container with ID starting with f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff not found: ID does not exist" containerID="f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff" Mar 
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.377834 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff"} err="failed to get container status \"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff\": rpc error: code = NotFound desc = could not find container \"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff\": container with ID starting with f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff not found: ID does not exist"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.377867 4792 scope.go:117] "RemoveContainer" containerID="08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547"
Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.380113 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547\": container with ID starting with 08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547 not found: ID does not exist" containerID="08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.380152 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0521e997-0d3e-4e56-9302-cbd2c79e2c0a" (UID: "0521e997-0d3e-4e56-9302-cbd2c79e2c0a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.380148 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547"} err="failed to get container status \"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547\": rpc error: code = NotFound desc = could not find container \"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547\": container with ID starting with 08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547 not found: ID does not exist"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.380187 4792 scope.go:117] "RemoveContainer" containerID="f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.380482 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff"} err="failed to get container status \"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff\": rpc error: code = NotFound desc = could not find container \"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff\": container with ID starting with f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff not found: ID does not exist"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.380505 4792 scope.go:117] "RemoveContainer" containerID="08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.380827 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547"} err="failed to get container status \"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547\": rpc error: code = NotFound desc = could not find container \"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547\": container with ID starting with 08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547 not found: ID does not exist"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.439131 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" path="/var/lib/kubelet/pods/b5f281f2-c77c-49cc-93a4-e7ed029f29bb/volumes"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.479476 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef6cc4e-2fd6-403b-a163-638395c30672-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.479558 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef6cc4e-2fd6-403b-a163-638395c30672-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.479624 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4rfg\" (UniqueName: \"kubernetes.io/projected/9ef6cc4e-2fd6-403b-a163-638395c30672-kube-api-access-z4rfg\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.479802 4792 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.483841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef6cc4e-2fd6-403b-a163-638395c30672-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.485255 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef6cc4e-2fd6-403b-a163-638395c30672-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.496441 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4rfg\" (UniqueName: \"kubernetes.io/projected/9ef6cc4e-2fd6-403b-a163-638395c30672-kube-api-access-z4rfg\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.618296 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.625592 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.641879 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.652130 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.653767 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.666123 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.666324 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.691971 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.786033 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.786114 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.786138 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqzd2\" (UniqueName: \"kubernetes.io/projected/66b40740-5f2c-4f3a-9d20-3307335829ed-kube-api-access-lqzd2\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.786189 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.786427 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66b40740-5f2c-4f3a-9d20-3307335829ed-logs\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.889096 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66b40740-5f2c-4f3a-9d20-3307335829ed-logs\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.889959 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.890006 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.890023 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqzd2\" (UniqueName: \"kubernetes.io/projected/66b40740-5f2c-4f3a-9d20-3307335829ed-kube-api-access-lqzd2\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.890077 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.889862 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66b40740-5f2c-4f3a-9d20-3307335829ed-logs\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.894339 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.894412 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.894752 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.932638 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqzd2\" (UniqueName: \"kubernetes.io/projected/66b40740-5f2c-4f3a-9d20-3307335829ed-kube-api-access-lqzd2\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0"
Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.970876 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 01 09:30:26 crc kubenswrapper[4792]: I0301 09:30:26.089345 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 01 09:30:26 crc kubenswrapper[4792]: I0301 09:30:26.330678 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9ef6cc4e-2fd6-403b-a163-638395c30672","Type":"ContainerStarted","Data":"3c916f3a8483dfafcf41e54c045dfc799303a49d176cbd1bf0abdc70d9f1810a"}
Mar 01 09:30:26 crc kubenswrapper[4792]: I0301 09:30:26.331043 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 01 09:30:26 crc kubenswrapper[4792]: I0301 09:30:26.331056 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9ef6cc4e-2fd6-403b-a163-638395c30672","Type":"ContainerStarted","Data":"73174187fe252d0dfc1749cb6d328c9b3e5fc0df9679a189709faf90120b6094"}
Mar 01 09:30:26 crc kubenswrapper[4792]: I0301 09:30:26.352802 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.352786212 podStartE2EDuration="1.352786212s" podCreationTimestamp="2026-03-01 09:30:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:26.348191351 +0000 UTC m=+1355.590070548" watchObservedRunningTime="2026-03-01 09:30:26.352786212 +0000 UTC m=+1355.594665419"
Mar 01 09:30:26 crc kubenswrapper[4792]: I0301 09:30:26.464116 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 01 09:30:26 crc kubenswrapper[4792]: W0301 09:30:26.465348 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66b40740_5f2c_4f3a_9d20_3307335829ed.slice/crio-243858d3fba357536368a417ecf1a917b39cc53d20c017fb5d7083ee13365f6d WatchSource:0}: Error finding container 243858d3fba357536368a417ecf1a917b39cc53d20c017fb5d7083ee13365f6d: Status 404 returned error can't find the container with id 243858d3fba357536368a417ecf1a917b39cc53d20c017fb5d7083ee13365f6d
Mar 01 09:30:27 crc kubenswrapper[4792]: I0301 09:30:27.040104 4792 scope.go:117] "RemoveContainer" containerID="25e2a65861bddb5cf69014ea8d6e4a60ec1aeeeac6538cad770632d905286110"
Mar 01 09:30:27 crc kubenswrapper[4792]: I0301 09:30:27.346344 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66b40740-5f2c-4f3a-9d20-3307335829ed","Type":"ContainerStarted","Data":"a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8"}
Mar 01 09:30:27 crc kubenswrapper[4792]: I0301 09:30:27.346855 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66b40740-5f2c-4f3a-9d20-3307335829ed","Type":"ContainerStarted","Data":"4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed"}
Mar 01 09:30:27 crc kubenswrapper[4792]: I0301 09:30:27.346875 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66b40740-5f2c-4f3a-9d20-3307335829ed","Type":"ContainerStarted","Data":"243858d3fba357536368a417ecf1a917b39cc53d20c017fb5d7083ee13365f6d"}
Mar 01 09:30:27 crc kubenswrapper[4792]: I0301 09:30:27.376210 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.37619207 podStartE2EDuration="2.37619207s" podCreationTimestamp="2026-03-01 09:30:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:27.372590514 +0000 UTC m=+1356.614469721" watchObservedRunningTime="2026-03-01 09:30:27.37619207 +0000 UTC m=+1356.618071267"
Mar 01 09:30:27 crc kubenswrapper[4792]: I0301 09:30:27.429504 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" path="/var/lib/kubelet/pods/0521e997-0d3e-4e56-9302-cbd2c79e2c0a/volumes"
Mar 01 09:30:27 crc kubenswrapper[4792]: E0301 09:30:27.656300 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 01 09:30:27 crc kubenswrapper[4792]: E0301 09:30:27.658087 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 01 09:30:27 crc kubenswrapper[4792]: E0301 09:30:27.659760 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 01 09:30:27 crc kubenswrapper[4792]: E0301 09:30:27.659819 4792 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b8015a6e-cf5d-4728-b2e6-66bb8960fd40" containerName="nova-scheduler-scheduler"
Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.364780 4792 generic.go:334] "Generic (PLEG): container finished" podID="b8015a6e-cf5d-4728-b2e6-66bb8960fd40" containerID="43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03" exitCode=0
Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.364801 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b8015a6e-cf5d-4728-b2e6-66bb8960fd40","Type":"ContainerDied","Data":"43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03"}
Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.490055 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.661268 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-config-data\") pod \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") "
Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.661408 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwdw9\" (UniqueName: \"kubernetes.io/projected/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-kube-api-access-pwdw9\") pod \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") "
Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.661431 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-combined-ca-bundle\") pod \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") "
Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.667526 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-kube-api-access-pwdw9" (OuterVolumeSpecName: "kube-api-access-pwdw9") pod "b8015a6e-cf5d-4728-b2e6-66bb8960fd40" (UID: "b8015a6e-cf5d-4728-b2e6-66bb8960fd40"). InnerVolumeSpecName "kube-api-access-pwdw9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.686252 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8015a6e-cf5d-4728-b2e6-66bb8960fd40" (UID: "b8015a6e-cf5d-4728-b2e6-66bb8960fd40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.692422 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-config-data" (OuterVolumeSpecName: "config-data") pod "b8015a6e-cf5d-4728-b2e6-66bb8960fd40" (UID: "b8015a6e-cf5d-4728-b2e6-66bb8960fd40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.763493 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.763540 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwdw9\" (UniqueName: \"kubernetes.io/projected/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-kube-api-access-pwdw9\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.763557 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.246623 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.278122 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-combined-ca-bundle\") pod \"49690073-1340-4686-bc4b-f69901bb45d9\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") "
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.278162 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-config-data\") pod \"49690073-1340-4686-bc4b-f69901bb45d9\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") "
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.278257 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49690073-1340-4686-bc4b-f69901bb45d9-logs\") pod \"49690073-1340-4686-bc4b-f69901bb45d9\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") "
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.278316 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvskn\" (UniqueName: \"kubernetes.io/projected/49690073-1340-4686-bc4b-f69901bb45d9-kube-api-access-dvskn\") pod \"49690073-1340-4686-bc4b-f69901bb45d9\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") "
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.278970 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49690073-1340-4686-bc4b-f69901bb45d9-logs" (OuterVolumeSpecName: "logs") pod "49690073-1340-4686-bc4b-f69901bb45d9" (UID: "49690073-1340-4686-bc4b-f69901bb45d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.292119 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49690073-1340-4686-bc4b-f69901bb45d9-kube-api-access-dvskn" (OuterVolumeSpecName: "kube-api-access-dvskn") pod "49690073-1340-4686-bc4b-f69901bb45d9" (UID: "49690073-1340-4686-bc4b-f69901bb45d9"). InnerVolumeSpecName "kube-api-access-dvskn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.309230 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49690073-1340-4686-bc4b-f69901bb45d9" (UID: "49690073-1340-4686-bc4b-f69901bb45d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.313065 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-config-data" (OuterVolumeSpecName: "config-data") pod "49690073-1340-4686-bc4b-f69901bb45d9" (UID: "49690073-1340-4686-bc4b-f69901bb45d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.377757 4792 generic.go:334] "Generic (PLEG): container finished" podID="49690073-1340-4686-bc4b-f69901bb45d9" containerID="e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7" exitCode=0
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.377936 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49690073-1340-4686-bc4b-f69901bb45d9","Type":"ContainerDied","Data":"e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7"}
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.378526 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49690073-1340-4686-bc4b-f69901bb45d9","Type":"ContainerDied","Data":"527b5d3ab3c0911a39430a25a4440c1f01d3f9995da50cf30f22affc0352f613"}
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.378559 4792 scope.go:117] "RemoveContainer" containerID="e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.378086 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.380986 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49690073-1340-4686-bc4b-f69901bb45d9-logs\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.381010 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvskn\" (UniqueName: \"kubernetes.io/projected/49690073-1340-4686-bc4b-f69901bb45d9-kube-api-access-dvskn\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.381061 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.381075 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.391499 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b8015a6e-cf5d-4728-b2e6-66bb8960fd40","Type":"ContainerDied","Data":"8d6bd74bfd2c811ba8feb074f6cd879de61baffea6857711abd8ac6646b3af71"}
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.391546 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.403978 4792 scope.go:117] "RemoveContainer" containerID="0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.431789 4792 scope.go:117] "RemoveContainer" containerID="e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7"
Mar 01 09:30:30 crc kubenswrapper[4792]: E0301 09:30:30.432299 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7\": container with ID starting with e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7 not found: ID does not exist" containerID="e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.432356 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7"} err="failed to get container status \"e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7\": rpc error: code = NotFound desc = could not find container \"e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7\": container with ID starting with e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7 not found: ID does not exist"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.432382 4792 scope.go:117] "RemoveContainer" containerID="0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8"
Mar 01 09:30:30 crc kubenswrapper[4792]: E0301 09:30:30.436876 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8\": container with ID starting with 0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8 not found: ID does not exist" containerID="0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.437098 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8"} err="failed to get container status \"0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8\": rpc error: code = NotFound desc = could not find container \"0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8\": container with ID starting with 0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8 not found: ID does not exist"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.437196 4792 scope.go:117] "RemoveContainer" containerID="43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.444190 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.456454 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.475058 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.484347 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.501841 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 01 09:30:30 crc kubenswrapper[4792]: E0301 09:30:30.502385 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-log"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.502403 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-log"
Mar 01 09:30:30 crc kubenswrapper[4792]: E0301 09:30:30.502413 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-api"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.502419 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-api"
Mar 01 09:30:30 crc kubenswrapper[4792]: E0301 09:30:30.502444 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8015a6e-cf5d-4728-b2e6-66bb8960fd40" containerName="nova-scheduler-scheduler"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.502450 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8015a6e-cf5d-4728-b2e6-66bb8960fd40" containerName="nova-scheduler-scheduler"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.502632 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-log"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.502643 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-api"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.502661 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8015a6e-cf5d-4728-b2e6-66bb8960fd40" containerName="nova-scheduler-scheduler"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.503556 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.516178 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.522031 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.523418 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.525510 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.534511 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.541396 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.583082 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.583159 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.583179 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcn88\" (UniqueName: \"kubernetes.io/projected/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-kube-api-access-lcn88\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.583203 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-logs\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.583255 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.583490 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-config-data\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.583594 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4tl5\" (UniqueName: \"kubernetes.io/projected/4b5702fe-b26d-43ee-b702-4ac5527947cd-kube-api-access-k4tl5\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.685793 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.685880 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-config-data\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.685932 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4tl5\" (UniqueName: \"kubernetes.io/projected/4b5702fe-b26d-43ee-b702-4ac5527947cd-kube-api-access-k4tl5\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.685965 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.686015 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.686031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcn88\" (UniqueName: \"kubernetes.io/projected/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-kube-api-access-lcn88\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0"
Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.686053 4792 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-logs\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.686838 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-logs\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.689725 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-config-data\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.689851 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.690747 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.694432 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc 
kubenswrapper[4792]: I0301 09:30:30.703366 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcn88\" (UniqueName: \"kubernetes.io/projected/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-kube-api-access-lcn88\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.703368 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4tl5\" (UniqueName: \"kubernetes.io/projected/4b5702fe-b26d-43ee-b702-4ac5527947cd-kube-api-access-k4tl5\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.834992 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.851747 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.972303 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.972626 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 01 09:30:31 crc kubenswrapper[4792]: I0301 09:30:31.256291 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:31 crc kubenswrapper[4792]: I0301 09:30:31.341319 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:30:31 crc kubenswrapper[4792]: W0301 09:30:31.349259 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b5702fe_b26d_43ee_b702_4ac5527947cd.slice/crio-100c00aaeb772803e831be7330ba512b669313e4b7b45f0161f7576070d6c332 
WatchSource:0}: Error finding container 100c00aaeb772803e831be7330ba512b669313e4b7b45f0161f7576070d6c332: Status 404 returned error can't find the container with id 100c00aaeb772803e831be7330ba512b669313e4b7b45f0161f7576070d6c332 Mar 01 09:30:31 crc kubenswrapper[4792]: I0301 09:30:31.438332 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49690073-1340-4686-bc4b-f69901bb45d9" path="/var/lib/kubelet/pods/49690073-1340-4686-bc4b-f69901bb45d9/volumes" Mar 01 09:30:31 crc kubenswrapper[4792]: I0301 09:30:31.439755 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8015a6e-cf5d-4728-b2e6-66bb8960fd40" path="/var/lib/kubelet/pods/b8015a6e-cf5d-4728-b2e6-66bb8960fd40/volumes" Mar 01 09:30:31 crc kubenswrapper[4792]: I0301 09:30:31.440793 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5","Type":"ContainerStarted","Data":"89a1a9514a04ba2ef6114510db9082d4635321014208a8dffde8aafd68862a7c"} Mar 01 09:30:31 crc kubenswrapper[4792]: I0301 09:30:31.440854 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4b5702fe-b26d-43ee-b702-4ac5527947cd","Type":"ContainerStarted","Data":"100c00aaeb772803e831be7330ba512b669313e4b7b45f0161f7576070d6c332"} Mar 01 09:30:31 crc kubenswrapper[4792]: I0301 09:30:31.542474 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 01 09:30:32 crc kubenswrapper[4792]: I0301 09:30:32.465377 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5","Type":"ContainerStarted","Data":"59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7"} Mar 01 09:30:32 crc kubenswrapper[4792]: I0301 09:30:32.466592 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5","Type":"ContainerStarted","Data":"967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf"} Mar 01 09:30:32 crc kubenswrapper[4792]: I0301 09:30:32.471717 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4b5702fe-b26d-43ee-b702-4ac5527947cd","Type":"ContainerStarted","Data":"1130ef58928c6df8139553b0ba2bbd03e66b69c09b50f9867cc8cd0a70ce6c1e"} Mar 01 09:30:32 crc kubenswrapper[4792]: I0301 09:30:32.498663 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.498643897 podStartE2EDuration="2.498643897s" podCreationTimestamp="2026-03-01 09:30:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:32.49669826 +0000 UTC m=+1361.738577467" watchObservedRunningTime="2026-03-01 09:30:32.498643897 +0000 UTC m=+1361.740523114" Mar 01 09:30:32 crc kubenswrapper[4792]: I0301 09:30:32.520095 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.52007489 podStartE2EDuration="2.52007489s" podCreationTimestamp="2026-03-01 09:30:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:32.518264317 +0000 UTC m=+1361.760143514" watchObservedRunningTime="2026-03-01 09:30:32.52007489 +0000 UTC m=+1361.761954097" Mar 01 09:30:33 crc kubenswrapper[4792]: I0301 09:30:33.582665 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 01 09:30:33 crc kubenswrapper[4792]: I0301 09:30:33.583160 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c5db40bf-18aa-4877-ad92-35d50c549309" containerName="kube-state-metrics" 
containerID="cri-o://ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b" gracePeriod=30 Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.071729 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.163581 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5xkf\" (UniqueName: \"kubernetes.io/projected/c5db40bf-18aa-4877-ad92-35d50c549309-kube-api-access-t5xkf\") pod \"c5db40bf-18aa-4877-ad92-35d50c549309\" (UID: \"c5db40bf-18aa-4877-ad92-35d50c549309\") " Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.185626 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5db40bf-18aa-4877-ad92-35d50c549309-kube-api-access-t5xkf" (OuterVolumeSpecName: "kube-api-access-t5xkf") pod "c5db40bf-18aa-4877-ad92-35d50c549309" (UID: "c5db40bf-18aa-4877-ad92-35d50c549309"). InnerVolumeSpecName "kube-api-access-t5xkf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.265759 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5xkf\" (UniqueName: \"kubernetes.io/projected/c5db40bf-18aa-4877-ad92-35d50c549309-kube-api-access-t5xkf\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.495345 4792 generic.go:334] "Generic (PLEG): container finished" podID="c5db40bf-18aa-4877-ad92-35d50c549309" containerID="ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b" exitCode=2 Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.495413 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c5db40bf-18aa-4877-ad92-35d50c549309","Type":"ContainerDied","Data":"ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b"} Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.495445 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c5db40bf-18aa-4877-ad92-35d50c549309","Type":"ContainerDied","Data":"d3de3b349ed8682aadffbec7a09f7bd847d16614859016d386affe481743f302"} Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.495465 4792 scope.go:117] "RemoveContainer" containerID="ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.495700 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.524897 4792 scope.go:117] "RemoveContainer" containerID="ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b" Mar 01 09:30:34 crc kubenswrapper[4792]: E0301 09:30:34.526669 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b\": container with ID starting with ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b not found: ID does not exist" containerID="ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.526786 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b"} err="failed to get container status \"ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b\": rpc error: code = NotFound desc = could not find container \"ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b\": container with ID starting with ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b not found: ID does not exist" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.544988 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.552944 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.575969 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 01 09:30:34 crc kubenswrapper[4792]: E0301 09:30:34.576621 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5db40bf-18aa-4877-ad92-35d50c549309" containerName="kube-state-metrics" Mar 01 09:30:34 crc 
kubenswrapper[4792]: I0301 09:30:34.576716 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5db40bf-18aa-4877-ad92-35d50c549309" containerName="kube-state-metrics" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.577014 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5db40bf-18aa-4877-ad92-35d50c549309" containerName="kube-state-metrics" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.577673 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.583206 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.583385 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.597582 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.673523 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.673591 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.673628 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddbwq\" (UniqueName: \"kubernetes.io/projected/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-api-access-ddbwq\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.673717 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.775129 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.775200 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.775245 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddbwq\" (UniqueName: \"kubernetes.io/projected/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-api-access-ddbwq\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.775285 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.782367 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.785440 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.786646 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.792340 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.792655 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-central-agent" containerID="cri-o://98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321" gracePeriod=30 Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 
09:30:34.792716 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="sg-core" containerID="cri-o://26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f" gracePeriod=30 Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.792780 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-notification-agent" containerID="cri-o://b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6" gracePeriod=30 Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.792875 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="proxy-httpd" containerID="cri-o://ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463" gracePeriod=30 Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.806753 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddbwq\" (UniqueName: \"kubernetes.io/projected/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-api-access-ddbwq\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.900219 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.943894 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.943971 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.944013 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.944769 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d3c51b46f8c635f0bd922e6a816c6cbadeb855fe42f4d60474cde44514e4e4de"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.944824 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://d3c51b46f8c635f0bd922e6a816c6cbadeb855fe42f4d60474cde44514e4e4de" gracePeriod=600 Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.418120 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c5db40bf-18aa-4877-ad92-35d50c549309" path="/var/lib/kubelet/pods/c5db40bf-18aa-4877-ad92-35d50c549309/volumes" Mar 01 09:30:35 crc kubenswrapper[4792]: W0301 09:30:35.464922 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f21d62f_3539_4d5d_aeaa_cc816a51d412.slice/crio-52b3df514e15fe47df835ef406e9d71bccbbe7d4ea940261945dc0eb99090cbf WatchSource:0}: Error finding container 52b3df514e15fe47df835ef406e9d71bccbbe7d4ea940261945dc0eb99090cbf: Status 404 returned error can't find the container with id 52b3df514e15fe47df835ef406e9d71bccbbe7d4ea940261945dc0eb99090cbf Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.468950 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.505210 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6f21d62f-3539-4d5d-aeaa-cc816a51d412","Type":"ContainerStarted","Data":"52b3df514e15fe47df835ef406e9d71bccbbe7d4ea940261945dc0eb99090cbf"} Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.508137 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="d3c51b46f8c635f0bd922e6a816c6cbadeb855fe42f4d60474cde44514e4e4de" exitCode=0 Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.508207 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"d3c51b46f8c635f0bd922e6a816c6cbadeb855fe42f4d60474cde44514e4e4de"} Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.508249 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"6c6c336556c3895a23d652801049ab8fd2cc3ff89812dc0c31bb6831441e0e06"} Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.508268 4792 scope.go:117] "RemoveContainer" containerID="9fc8b9702d9d3591695478729a2a209996e2f83219ba8649a31afc02f286ad3f" Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.516866 4792 generic.go:334] "Generic (PLEG): container finished" podID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerID="ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463" exitCode=0 Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.516943 4792 generic.go:334] "Generic (PLEG): container finished" podID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerID="26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f" exitCode=2 Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.516958 4792 generic.go:334] "Generic (PLEG): container finished" podID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerID="98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321" exitCode=0 Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.516989 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerDied","Data":"ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463"} Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.517045 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerDied","Data":"26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f"} Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.517061 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerDied","Data":"98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321"} Mar 01 
09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.682352 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.852525 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.971984 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.972261 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 01 09:30:36 crc kubenswrapper[4792]: I0301 09:30:36.536811 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6f21d62f-3539-4d5d-aeaa-cc816a51d412","Type":"ContainerStarted","Data":"f0e9d810566d8958a1c8efced4ac0839665a299e441c1c47872026aff6a7c43c"}
Mar 01 09:30:36 crc kubenswrapper[4792]: I0301 09:30:36.537458 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 01 09:30:36 crc kubenswrapper[4792]: I0301 09:30:36.564990 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.119348254 podStartE2EDuration="2.564971569s" podCreationTimestamp="2026-03-01 09:30:34 +0000 UTC" firstStartedPulling="2026-03-01 09:30:35.46736143 +0000 UTC m=+1364.709240627" lastFinishedPulling="2026-03-01 09:30:35.912984745 +0000 UTC m=+1365.154863942" observedRunningTime="2026-03-01 09:30:36.553274039 +0000 UTC m=+1365.795153236" watchObservedRunningTime="2026-03-01 09:30:36.564971569 +0000 UTC m=+1365.806850766"
Mar 01 09:30:36 crc kubenswrapper[4792]: I0301 09:30:36.988084 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 01 09:30:36 crc kubenswrapper[4792]: I0301 09:30:36.988305 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 01 09:30:39 crc kubenswrapper[4792]: I0301 09:30:39.984413 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.102818 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54hjp\" (UniqueName: \"kubernetes.io/projected/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-kube-api-access-54hjp\") pod \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") "
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.102977 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-scripts\") pod \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") "
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.103031 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-sg-core-conf-yaml\") pod \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") "
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.103060 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-combined-ca-bundle\") pod \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") "
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.103097 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-run-httpd\") pod \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") "
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.103132 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-log-httpd\") pod \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") "
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.103202 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-config-data\") pod \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") "
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.105522 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0b9dfa7c-35ce-4f0d-9439-ed55e060a486" (UID: "0b9dfa7c-35ce-4f0d-9439-ed55e060a486"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.105744 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0b9dfa7c-35ce-4f0d-9439-ed55e060a486" (UID: "0b9dfa7c-35ce-4f0d-9439-ed55e060a486"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.110199 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-scripts" (OuterVolumeSpecName: "scripts") pod "0b9dfa7c-35ce-4f0d-9439-ed55e060a486" (UID: "0b9dfa7c-35ce-4f0d-9439-ed55e060a486"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.117957 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-kube-api-access-54hjp" (OuterVolumeSpecName: "kube-api-access-54hjp") pod "0b9dfa7c-35ce-4f0d-9439-ed55e060a486" (UID: "0b9dfa7c-35ce-4f0d-9439-ed55e060a486"). InnerVolumeSpecName "kube-api-access-54hjp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.152077 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0b9dfa7c-35ce-4f0d-9439-ed55e060a486" (UID: "0b9dfa7c-35ce-4f0d-9439-ed55e060a486"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.186735 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b9dfa7c-35ce-4f0d-9439-ed55e060a486" (UID: "0b9dfa7c-35ce-4f0d-9439-ed55e060a486"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.205263 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54hjp\" (UniqueName: \"kubernetes.io/projected/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-kube-api-access-54hjp\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.205416 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.205476 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.205551 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.205607 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.205661 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.212152 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-config-data" (OuterVolumeSpecName: "config-data") pod "0b9dfa7c-35ce-4f0d-9439-ed55e060a486" (UID: "0b9dfa7c-35ce-4f0d-9439-ed55e060a486"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.307162 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.572566 4792 generic.go:334] "Generic (PLEG): container finished" podID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerID="b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6" exitCode=0
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.572608 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerDied","Data":"b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6"}
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.572640 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.572649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerDied","Data":"13d1178136245bb5d7f800642b1b34582d1e453c7a1057abd9a450d34e19ac11"}
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.572682 4792 scope.go:117] "RemoveContainer" containerID="ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.597216 4792 scope.go:117] "RemoveContainer" containerID="26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.616163 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.643543 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.661480 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.661812 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="proxy-httpd"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.661833 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="proxy-httpd"
Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.661860 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="sg-core"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.661868 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="sg-core"
Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.661890 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-central-agent"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.661899 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-central-agent"
Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.662006 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-notification-agent"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.662015 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-notification-agent"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.662233 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="proxy-httpd"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.662267 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-central-agent"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.662278 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="sg-core"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.662288 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-notification-agent"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.663860 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.671740 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.671963 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.672096 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.682495 4792 scope.go:117] "RemoveContainer" containerID="b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.683443 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.730137 4792 scope.go:117] "RemoveContainer" containerID="98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.749856 4792 scope.go:117] "RemoveContainer" containerID="ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463"
Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.750254 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463\": container with ID starting with ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463 not found: ID does not exist" containerID="ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.750615 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463"} err="failed to get container status \"ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463\": rpc error: code = NotFound desc = could not find container \"ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463\": container with ID starting with ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463 not found: ID does not exist"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.750637 4792 scope.go:117] "RemoveContainer" containerID="26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f"
Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.751056 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f\": container with ID starting with 26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f not found: ID does not exist" containerID="26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.751112 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f"} err="failed to get container status \"26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f\": rpc error: code = NotFound desc = could not find container \"26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f\": container with ID starting with 26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f not found: ID does not exist"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.751155 4792 scope.go:117] "RemoveContainer" containerID="b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6"
Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.753478 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6\": container with ID starting with b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6 not found: ID does not exist" containerID="b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.753503 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6"} err="failed to get container status \"b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6\": rpc error: code = NotFound desc = could not find container \"b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6\": container with ID starting with b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6 not found: ID does not exist"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.753516 4792 scope.go:117] "RemoveContainer" containerID="98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321"
Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.753785 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321\": container with ID starting with 98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321 not found: ID does not exist" containerID="98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.753805 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321"} err="failed to get container status \"98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321\": rpc error: code = NotFound desc = could not find container \"98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321\": container with ID starting with 98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321 not found: ID does not exist"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818041 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818093 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb7pg\" (UniqueName: \"kubernetes.io/projected/29759876-6cc9-4695-b6a1-b0204c0eeefe-kube-api-access-nb7pg\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818299 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-config-data\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818377 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818424 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-run-httpd\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818613 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-log-httpd\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818783 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-scripts\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818843 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.836429 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.836482 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.853048 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.880792 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.920930 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.921014 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.921059 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb7pg\" (UniqueName: \"kubernetes.io/projected/29759876-6cc9-4695-b6a1-b0204c0eeefe-kube-api-access-nb7pg\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.921103 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-config-data\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.921123 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.921142 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-run-httpd\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.921209 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-log-httpd\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.921249 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-scripts\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.922845 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-run-httpd\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.923054 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-log-httpd\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.927595 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-scripts\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.929282 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-config-data\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.929751 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.929930 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.946877 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.957015 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb7pg\" (UniqueName: \"kubernetes.io/projected/29759876-6cc9-4695-b6a1-b0204c0eeefe-kube-api-access-nb7pg\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0"
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.987927 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:30:41 crc kubenswrapper[4792]: I0301 09:30:41.417684 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" path="/var/lib/kubelet/pods/0b9dfa7c-35ce-4f0d-9439-ed55e060a486/volumes"
Mar 01 09:30:41 crc kubenswrapper[4792]: I0301 09:30:41.455481 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:30:41 crc kubenswrapper[4792]: W0301 09:30:41.458449 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29759876_6cc9_4695_b6a1_b0204c0eeefe.slice/crio-af74a6693f05dc86b0605fd8f64e799df50bc1492dea119b6c7b89e5c5c846cf WatchSource:0}: Error finding container af74a6693f05dc86b0605fd8f64e799df50bc1492dea119b6c7b89e5c5c846cf: Status 404 returned error can't find the container with id af74a6693f05dc86b0605fd8f64e799df50bc1492dea119b6c7b89e5c5c846cf
Mar 01 09:30:41 crc kubenswrapper[4792]: I0301 09:30:41.584216 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerStarted","Data":"af74a6693f05dc86b0605fd8f64e799df50bc1492dea119b6c7b89e5c5c846cf"}
Mar 01 09:30:41 crc kubenswrapper[4792]: I0301 09:30:41.635447 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 01 09:30:41 crc kubenswrapper[4792]: I0301 09:30:41.920440 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 01 09:30:41 crc kubenswrapper[4792]: I0301 09:30:41.920594 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 01 09:30:42 crc kubenswrapper[4792]: I0301 09:30:42.615114 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerStarted","Data":"527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a"}
Mar 01 09:30:43 crc kubenswrapper[4792]: I0301 09:30:43.627410 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerStarted","Data":"34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013"}
Mar 01 09:30:43 crc kubenswrapper[4792]: I0301 09:30:43.627724 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerStarted","Data":"27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3"}
Mar 01 09:30:44 crc kubenswrapper[4792]: I0301 09:30:44.910470 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 01 09:30:45 crc kubenswrapper[4792]: I0301 09:30:45.646717 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerStarted","Data":"ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af"}
Mar 01 09:30:45 crc kubenswrapper[4792]: I0301 09:30:45.647074 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 01 09:30:45 crc kubenswrapper[4792]: I0301 09:30:45.674689 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.977116352 podStartE2EDuration="5.674497247s" podCreationTimestamp="2026-03-01 09:30:40 +0000 UTC" firstStartedPulling="2026-03-01 09:30:41.460473853 +0000 UTC m=+1370.702353040" lastFinishedPulling="2026-03-01 09:30:45.157854748 +0000 UTC m=+1374.399733935" observedRunningTime="2026-03-01 09:30:45.666900614 +0000 UTC m=+1374.908779811" watchObservedRunningTime="2026-03-01 09:30:45.674497247 +0000 UTC m=+1374.916376434"
Mar 01 09:30:45 crc kubenswrapper[4792]: I0301 09:30:45.976995 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 01 09:30:45 crc kubenswrapper[4792]: I0301 09:30:45.977073 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 01 09:30:45 crc kubenswrapper[4792]: I0301 09:30:45.990636 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 01 09:30:45 crc kubenswrapper[4792]: I0301 09:30:45.999470 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.530062 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.676531 4792 generic.go:334] "Generic (PLEG): container finished" podID="2b24b4c8-4f85-4eae-93c7-3249c1a54f09" containerID="b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5" exitCode=137
Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.676573 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2b24b4c8-4f85-4eae-93c7-3249c1a54f09","Type":"ContainerDied","Data":"b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5"}
Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.676599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2b24b4c8-4f85-4eae-93c7-3249c1a54f09","Type":"ContainerDied","Data":"666f673dbf785249e4b855230a831650c499d51cb9413297a868fb5bc1afca52"}
Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.676615 4792 scope.go:117] "RemoveContainer" containerID="b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5"
Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.676724 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.692340 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-config-data\") pod \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") "
Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.692391 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-combined-ca-bundle\") pod \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") "
Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.692481 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s2hb\" (UniqueName: \"kubernetes.io/projected/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-kube-api-access-5s2hb\") pod \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") "
Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.698769 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-kube-api-access-5s2hb" (OuterVolumeSpecName: "kube-api-access-5s2hb") pod "2b24b4c8-4f85-4eae-93c7-3249c1a54f09" (UID: "2b24b4c8-4f85-4eae-93c7-3249c1a54f09"). InnerVolumeSpecName "kube-api-access-5s2hb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.700033 4792 scope.go:117] "RemoveContainer" containerID="b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5"
Mar 01 09:30:48 crc kubenswrapper[4792]: E0301 09:30:48.700506 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5\": container with ID starting with b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5 not found: ID does not exist" containerID="b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5"
Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.700596 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5"} err="failed to get container status \"b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5\": rpc error: code = NotFound desc = could not find container \"b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5\": container with ID starting with b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5 not found: ID does not exist"
Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.724550 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-config-data" (OuterVolumeSpecName: "config-data") pod "2b24b4c8-4f85-4eae-93c7-3249c1a54f09" (UID: "2b24b4c8-4f85-4eae-93c7-3249c1a54f09"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.735631 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b24b4c8-4f85-4eae-93c7-3249c1a54f09" (UID: "2b24b4c8-4f85-4eae-93c7-3249c1a54f09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.794697 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.794731 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.794742 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s2hb\" (UniqueName: \"kubernetes.io/projected/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-kube-api-access-5s2hb\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.007841 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.017208 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.038751 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:49 crc kubenswrapper[4792]: E0301 09:30:49.039082 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b24b4c8-4f85-4eae-93c7-3249c1a54f09" 
containerName="nova-cell1-novncproxy-novncproxy" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.039097 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b24b4c8-4f85-4eae-93c7-3249c1a54f09" containerName="nova-cell1-novncproxy-novncproxy" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.039289 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b24b4c8-4f85-4eae-93c7-3249c1a54f09" containerName="nova-cell1-novncproxy-novncproxy" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.039794 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.041443 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.041617 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.041831 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.056216 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.098990 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.099089 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5ztz\" (UniqueName: 
\"kubernetes.io/projected/63afaac7-c934-4410-b2b5-ab04ad085489-kube-api-access-s5ztz\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.099132 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.099203 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.099263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.200601 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.200974 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.201114 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.201253 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5ztz\" (UniqueName: \"kubernetes.io/projected/63afaac7-c934-4410-b2b5-ab04ad085489-kube-api-access-s5ztz\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.201382 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.204431 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.204471 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.205119 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.205493 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.221607 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5ztz\" (UniqueName: \"kubernetes.io/projected/63afaac7-c934-4410-b2b5-ab04ad085489-kube-api-access-s5ztz\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.355635 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.460256 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b24b4c8-4f85-4eae-93c7-3249c1a54f09" path="/var/lib/kubelet/pods/2b24b4c8-4f85-4eae-93c7-3249c1a54f09/volumes" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.866095 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:49 crc kubenswrapper[4792]: W0301 09:30:49.870741 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63afaac7_c934_4410_b2b5_ab04ad085489.slice/crio-9e55529456f0c08af6f0894b4440664e6083bc44293a3fb2e38ce9051ac65d5b WatchSource:0}: Error finding container 9e55529456f0c08af6f0894b4440664e6083bc44293a3fb2e38ce9051ac65d5b: Status 404 returned error can't find the container with id 9e55529456f0c08af6f0894b4440664e6083bc44293a3fb2e38ce9051ac65d5b Mar 01 09:30:50 crc kubenswrapper[4792]: I0301 09:30:50.696003 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"63afaac7-c934-4410-b2b5-ab04ad085489","Type":"ContainerStarted","Data":"4d99695ffa50fb2178db1d19dd03219e4bc573c6f2cc98fa8df1e40c4d180dcb"} Mar 01 09:30:50 crc kubenswrapper[4792]: I0301 09:30:50.696339 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"63afaac7-c934-4410-b2b5-ab04ad085489","Type":"ContainerStarted","Data":"9e55529456f0c08af6f0894b4440664e6083bc44293a3fb2e38ce9051ac65d5b"} Mar 01 09:30:50 crc kubenswrapper[4792]: I0301 09:30:50.731396 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.7313786009999999 podStartE2EDuration="1.731378601s" podCreationTimestamp="2026-03-01 09:30:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:50.726602437 +0000 UTC m=+1379.968481634" watchObservedRunningTime="2026-03-01 09:30:50.731378601 +0000 UTC m=+1379.973257798" Mar 01 09:30:50 crc kubenswrapper[4792]: I0301 09:30:50.976471 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 01 09:30:50 crc kubenswrapper[4792]: I0301 09:30:50.977408 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 01 09:30:51 crc kubenswrapper[4792]: I0301 09:30:51.091616 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 01 09:30:51 crc kubenswrapper[4792]: I0301 09:30:51.094191 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 01 09:30:51 crc kubenswrapper[4792]: I0301 09:30:51.705585 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 01 09:30:51 crc kubenswrapper[4792]: I0301 09:30:51.719758 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 01 09:30:51 crc kubenswrapper[4792]: I0301 09:30:51.936667 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-2pgch"] Mar 01 09:30:51 crc kubenswrapper[4792]: I0301 09:30:51.942190 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:51 crc kubenswrapper[4792]: I0301 09:30:51.973936 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-2pgch"] Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.070850 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-dns-svc\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.070960 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-sb\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.071058 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-config\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.072227 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-nb\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.072266 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-drdkm\" (UniqueName: \"kubernetes.io/projected/3dc02dae-4469-4e20-aca1-c85d7e451b7f-kube-api-access-drdkm\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.173776 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-nb\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.173848 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drdkm\" (UniqueName: \"kubernetes.io/projected/3dc02dae-4469-4e20-aca1-c85d7e451b7f-kube-api-access-drdkm\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.173898 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-dns-svc\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.174002 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-sb\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.174122 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-config\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.174729 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-nb\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.174852 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-dns-svc\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.174896 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-sb\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.175024 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-config\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.197596 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drdkm\" (UniqueName: \"kubernetes.io/projected/3dc02dae-4469-4e20-aca1-c85d7e451b7f-kube-api-access-drdkm\") pod \"dnsmasq-dns-6c74598c69-2pgch\" 
(UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.261699 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.764972 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-2pgch"] Mar 01 09:30:52 crc kubenswrapper[4792]: W0301 09:30:52.771154 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dc02dae_4469_4e20_aca1_c85d7e451b7f.slice/crio-6c7fd81ed9af7972987f7c71c071457fa812a6cd3376e81094962a9f69854811 WatchSource:0}: Error finding container 6c7fd81ed9af7972987f7c71c071457fa812a6cd3376e81094962a9f69854811: Status 404 returned error can't find the container with id 6c7fd81ed9af7972987f7c71c071457fa812a6cd3376e81094962a9f69854811 Mar 01 09:30:53 crc kubenswrapper[4792]: I0301 09:30:53.722530 4792 generic.go:334] "Generic (PLEG): container finished" podID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerID="e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c" exitCode=0 Mar 01 09:30:53 crc kubenswrapper[4792]: I0301 09:30:53.722681 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" event={"ID":"3dc02dae-4469-4e20-aca1-c85d7e451b7f","Type":"ContainerDied","Data":"e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c"} Mar 01 09:30:53 crc kubenswrapper[4792]: I0301 09:30:53.722868 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" event={"ID":"3dc02dae-4469-4e20-aca1-c85d7e451b7f","Type":"ContainerStarted","Data":"6c7fd81ed9af7972987f7c71c071457fa812a6cd3376e81094962a9f69854811"} Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.356606 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.515269 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.578788 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.583317 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-central-agent" containerID="cri-o://527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a" gracePeriod=30 Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.583722 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="sg-core" containerID="cri-o://34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013" gracePeriod=30 Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.583774 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-notification-agent" containerID="cri-o://27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3" gracePeriod=30 Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.584183 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="proxy-httpd" containerID="cri-o://ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af" gracePeriod=30 Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.733738 4792 generic.go:334] "Generic (PLEG): container finished" podID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerID="34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013" exitCode=2 Mar 01 
09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.733782 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerDied","Data":"34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013"} Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.735562 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" event={"ID":"3dc02dae-4469-4e20-aca1-c85d7e451b7f","Type":"ContainerStarted","Data":"a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954"} Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.735656 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-log" containerID="cri-o://967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf" gracePeriod=30 Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.735765 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-api" containerID="cri-o://59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7" gracePeriod=30 Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.758702 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" podStartSLOduration=3.758682628 podStartE2EDuration="3.758682628s" podCreationTimestamp="2026-03-01 09:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:54.752360676 +0000 UTC m=+1383.994239873" watchObservedRunningTime="2026-03-01 09:30:54.758682628 +0000 UTC m=+1384.000561825" Mar 01 09:30:55 crc kubenswrapper[4792]: I0301 09:30:55.749211 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerID="ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af" exitCode=0 Mar 01 09:30:55 crc kubenswrapper[4792]: I0301 09:30:55.749580 4792 generic.go:334] "Generic (PLEG): container finished" podID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerID="527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a" exitCode=0 Mar 01 09:30:55 crc kubenswrapper[4792]: I0301 09:30:55.749274 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerDied","Data":"ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af"} Mar 01 09:30:55 crc kubenswrapper[4792]: I0301 09:30:55.749660 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerDied","Data":"527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a"} Mar 01 09:30:55 crc kubenswrapper[4792]: I0301 09:30:55.752264 4792 generic.go:334] "Generic (PLEG): container finished" podID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerID="967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf" exitCode=143 Mar 01 09:30:55 crc kubenswrapper[4792]: I0301 09:30:55.752403 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5","Type":"ContainerDied","Data":"967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf"} Mar 01 09:30:55 crc kubenswrapper[4792]: I0301 09:30:55.752740 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.201793 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256628 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-sg-core-conf-yaml\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256674 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb7pg\" (UniqueName: \"kubernetes.io/projected/29759876-6cc9-4695-b6a1-b0204c0eeefe-kube-api-access-nb7pg\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256714 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-ceilometer-tls-certs\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256749 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-config-data\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256774 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-run-httpd\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256814 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-scripts\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256856 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-combined-ca-bundle\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256882 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-log-httpd\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.257161 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.257484 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.267818 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29759876-6cc9-4695-b6a1-b0204c0eeefe-kube-api-access-nb7pg" (OuterVolumeSpecName: "kube-api-access-nb7pg") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "kube-api-access-nb7pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.295579 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-scripts" (OuterVolumeSpecName: "scripts") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.365797 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.367334 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.370634 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.370678 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.370691 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.370702 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb7pg\" (UniqueName: \"kubernetes.io/projected/29759876-6cc9-4695-b6a1-b0204c0eeefe-kube-api-access-nb7pg\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.405658 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.435287 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-config-data" (OuterVolumeSpecName: "config-data") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.472087 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.472123 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.472138 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.472146 4792 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.764712 4792 generic.go:334] "Generic (PLEG): container finished" podID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerID="27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3" exitCode=0 Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.764739 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerDied","Data":"27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3"} Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.764787 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerDied","Data":"af74a6693f05dc86b0605fd8f64e799df50bc1492dea119b6c7b89e5c5c846cf"} Mar 01 09:30:56 
crc kubenswrapper[4792]: I0301 09:30:56.764805 4792 scope.go:117] "RemoveContainer" containerID="ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.764813 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.789081 4792 scope.go:117] "RemoveContainer" containerID="34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.799147 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.811445 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.820747 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.821293 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-central-agent" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.821372 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-central-agent" Mar 01 09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.821444 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-notification-agent" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.821501 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-notification-agent" Mar 01 09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.821572 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="proxy-httpd" 
Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.821628 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="proxy-httpd" Mar 01 09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.821691 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="sg-core" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.821741 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="sg-core" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.821972 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-notification-agent" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.822036 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="proxy-httpd" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.822105 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="sg-core" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.822167 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-central-agent" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.824557 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.826935 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.827377 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.827498 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.842896 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.865486 4792 scope.go:117] "RemoveContainer" containerID="27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.878950 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.879027 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.879133 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-scripts\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " 
pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.879166 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-run-httpd\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.879199 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl6tg\" (UniqueName: \"kubernetes.io/projected/a4871267-e63c-4804-a404-869a0fdbd171-kube-api-access-cl6tg\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.879270 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-config-data\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.879315 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.879338 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-log-httpd\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.894929 4792 scope.go:117] "RemoveContainer" 
containerID="527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.915264 4792 scope.go:117] "RemoveContainer" containerID="ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af" Mar 01 09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.915723 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af\": container with ID starting with ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af not found: ID does not exist" containerID="ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.915800 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af"} err="failed to get container status \"ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af\": rpc error: code = NotFound desc = could not find container \"ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af\": container with ID starting with ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af not found: ID does not exist" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.915857 4792 scope.go:117] "RemoveContainer" containerID="34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013" Mar 01 09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.916244 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013\": container with ID starting with 34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013 not found: ID does not exist" containerID="34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013" Mar 01 09:30:56 crc 
kubenswrapper[4792]: I0301 09:30:56.916291 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013"} err="failed to get container status \"34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013\": rpc error: code = NotFound desc = could not find container \"34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013\": container with ID starting with 34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013 not found: ID does not exist" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.916319 4792 scope.go:117] "RemoveContainer" containerID="27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3" Mar 01 09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.916774 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3\": container with ID starting with 27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3 not found: ID does not exist" containerID="27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.916798 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3"} err="failed to get container status \"27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3\": rpc error: code = NotFound desc = could not find container \"27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3\": container with ID starting with 27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3 not found: ID does not exist" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.916811 4792 scope.go:117] "RemoveContainer" containerID="527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a" Mar 01 
09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.917244 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a\": container with ID starting with 527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a not found: ID does not exist" containerID="527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.917266 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a"} err="failed to get container status \"527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a\": rpc error: code = NotFound desc = could not find container \"527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a\": container with ID starting with 527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a not found: ID does not exist" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.981166 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-scripts\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.981249 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-run-httpd\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.981282 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl6tg\" (UniqueName: 
\"kubernetes.io/projected/a4871267-e63c-4804-a404-869a0fdbd171-kube-api-access-cl6tg\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.981341 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-config-data\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.981371 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.981390 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-log-httpd\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.981840 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.982077 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-run-httpd\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 
09:30:56.982179 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-log-httpd\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.982516 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.985342 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-scripts\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.986388 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-config-data\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.986435 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:57 crc kubenswrapper[4792]: I0301 09:30:57.000451 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:57 crc kubenswrapper[4792]: I0301 09:30:57.000892 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:57 crc kubenswrapper[4792]: I0301 09:30:57.007656 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl6tg\" (UniqueName: \"kubernetes.io/projected/a4871267-e63c-4804-a404-869a0fdbd171-kube-api-access-cl6tg\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:57 crc kubenswrapper[4792]: I0301 09:30:57.163696 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:30:57 crc kubenswrapper[4792]: I0301 09:30:57.418187 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" path="/var/lib/kubelet/pods/29759876-6cc9-4695-b6a1-b0204c0eeefe/volumes" Mar 01 09:30:57 crc kubenswrapper[4792]: I0301 09:30:57.601962 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:57 crc kubenswrapper[4792]: W0301 09:30:57.604466 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4871267_e63c_4804_a404_869a0fdbd171.slice/crio-5daa2a8c28823d6b2f08c8830868b8a9480988afbc9d6f7999eee6cf3c7e7ff9 WatchSource:0}: Error finding container 5daa2a8c28823d6b2f08c8830868b8a9480988afbc9d6f7999eee6cf3c7e7ff9: Status 404 returned error can't find the container with id 5daa2a8c28823d6b2f08c8830868b8a9480988afbc9d6f7999eee6cf3c7e7ff9 Mar 01 09:30:57 crc kubenswrapper[4792]: I0301 09:30:57.773657 4792 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerStarted","Data":"5daa2a8c28823d6b2f08c8830868b8a9480988afbc9d6f7999eee6cf3c7e7ff9"} Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.238999 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.312495 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-logs\") pod \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.312664 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcn88\" (UniqueName: \"kubernetes.io/projected/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-kube-api-access-lcn88\") pod \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.312754 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-combined-ca-bundle\") pod \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.312816 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data\") pod \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.313708 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-logs" 
(OuterVolumeSpecName: "logs") pod "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" (UID: "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.321671 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-kube-api-access-lcn88" (OuterVolumeSpecName: "kube-api-access-lcn88") pod "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" (UID: "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5"). InnerVolumeSpecName "kube-api-access-lcn88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:58 crc kubenswrapper[4792]: E0301 09:30:58.346024 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data podName:a9e82dc0-29af-47c8-bbef-1fd4bb999ff5 nodeName:}" failed. No retries permitted until 2026-03-01 09:30:58.846001704 +0000 UTC m=+1388.087880901 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data") pod "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" (UID: "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5") : error deleting /var/lib/kubelet/pods/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5/volume-subpaths: remove /var/lib/kubelet/pods/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5/volume-subpaths: no such file or directory Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.349328 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" (UID: "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.415415 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.415450 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-logs\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.415465 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcn88\" (UniqueName: \"kubernetes.io/projected/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-kube-api-access-lcn88\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.782168 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerStarted","Data":"3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14"}
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.784090 4792 generic.go:334] "Generic (PLEG): container finished" podID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerID="59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7" exitCode=0
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.784136 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5","Type":"ContainerDied","Data":"59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7"}
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.784157 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5","Type":"ContainerDied","Data":"89a1a9514a04ba2ef6114510db9082d4635321014208a8dffde8aafd68862a7c"}
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.784175 4792 scope.go:117] "RemoveContainer" containerID="59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7"
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.784309 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.883567 4792 scope.go:117] "RemoveContainer" containerID="967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf"
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.931420 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data\") pod \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") "
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.936631 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data" (OuterVolumeSpecName: "config-data") pod "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" (UID: "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.954166 4792 scope.go:117] "RemoveContainer" containerID="59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7"
Mar 01 09:30:58 crc kubenswrapper[4792]: E0301 09:30:58.957592 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7\": container with ID starting with 59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7 not found: ID does not exist" containerID="59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7"
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.957660 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7"} err="failed to get container status \"59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7\": rpc error: code = NotFound desc = could not find container \"59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7\": container with ID starting with 59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7 not found: ID does not exist"
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.957684 4792 scope.go:117] "RemoveContainer" containerID="967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf"
Mar 01 09:30:58 crc kubenswrapper[4792]: E0301 09:30:58.958283 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf\": container with ID starting with 967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf not found: ID does not exist" containerID="967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf"
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.958311 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf"} err="failed to get container status \"967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf\": rpc error: code = NotFound desc = could not find container \"967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf\": container with ID starting with 967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf not found: ID does not exist"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.036890 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.114478 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.123460 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.139576 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 01 09:30:59 crc kubenswrapper[4792]: E0301 09:30:59.140037 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-api"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.140061 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-api"
Mar 01 09:30:59 crc kubenswrapper[4792]: E0301 09:30:59.140093 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-log"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.140101 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-log"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.140303 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-log"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.140345 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-api"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.141415 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.143771 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.145768 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.149557 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.154544 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.240700 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0641dd98-580b-48cf-87e8-4e0c891e18bd-logs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.240836 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-config-data\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.240856 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-public-tls-certs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.240980 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.241016 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtjdm\" (UniqueName: \"kubernetes.io/projected/0641dd98-580b-48cf-87e8-4e0c891e18bd-kube-api-access-dtjdm\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.241045 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.342098 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjdm\" (UniqueName: \"kubernetes.io/projected/0641dd98-580b-48cf-87e8-4e0c891e18bd-kube-api-access-dtjdm\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.342177 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.342220 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0641dd98-580b-48cf-87e8-4e0c891e18bd-logs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.342302 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-config-data\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.342351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-public-tls-certs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.342413 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.342998 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0641dd98-580b-48cf-87e8-4e0c891e18bd-logs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.346447 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.347148 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-public-tls-certs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.347221 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-config-data\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.356225 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.358276 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.362393 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtjdm\" (UniqueName: \"kubernetes.io/projected/0641dd98-580b-48cf-87e8-4e0c891e18bd-kube-api-access-dtjdm\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.384437 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.424056 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" path="/var/lib/kubelet/pods/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5/volumes"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.456634 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.795080 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerStarted","Data":"9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20"}
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.795361 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerStarted","Data":"61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57"}
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.812782 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.933106 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.017405 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-6q8nq"]
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.018474 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6q8nq"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.020697 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6q8nq"]
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.021891 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.022528 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.061069 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfbmw\" (UniqueName: \"kubernetes.io/projected/8782d670-70cd-42cc-b4d7-c0c8275e457b-kube-api-access-jfbmw\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.061224 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-config-data\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.061260 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-scripts\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.061281 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.162718 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.163170 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfbmw\" (UniqueName: \"kubernetes.io/projected/8782d670-70cd-42cc-b4d7-c0c8275e457b-kube-api-access-jfbmw\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.163307 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-config-data\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.163340 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-scripts\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.166191 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-scripts\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.166239 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-config-data\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.167591 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.187682 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfbmw\" (UniqueName: \"kubernetes.io/projected/8782d670-70cd-42cc-b4d7-c0c8275e457b-kube-api-access-jfbmw\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.388286 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6q8nq"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.805352 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0641dd98-580b-48cf-87e8-4e0c891e18bd","Type":"ContainerStarted","Data":"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608"}
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.805403 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0641dd98-580b-48cf-87e8-4e0c891e18bd","Type":"ContainerStarted","Data":"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9"}
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.805417 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0641dd98-580b-48cf-87e8-4e0c891e18bd","Type":"ContainerStarted","Data":"c1554809fd8548d7b1bbefcc4dff40233ceca12e7183c20dd377bc2e84b15644"}
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.826730 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.8267148770000001 podStartE2EDuration="1.826714877s" podCreationTimestamp="2026-03-01 09:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:31:00.822831183 +0000 UTC m=+1390.064710390" watchObservedRunningTime="2026-03-01 09:31:00.826714877 +0000 UTC m=+1390.068594074"
Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.878490 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6q8nq"]
Mar 01 09:31:01 crc kubenswrapper[4792]: I0301 09:31:01.815937 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerStarted","Data":"6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d"}
Mar 01 09:31:01 crc kubenswrapper[4792]: I0301 09:31:01.816567 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 01 09:31:01 crc kubenswrapper[4792]: I0301 09:31:01.819063 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6q8nq" event={"ID":"8782d670-70cd-42cc-b4d7-c0c8275e457b","Type":"ContainerStarted","Data":"c8dd8174a7f29b772205511318ce1d6a20b6f7ef7820849db345c8f1e46e0166"}
Mar 01 09:31:01 crc kubenswrapper[4792]: I0301 09:31:01.819130 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6q8nq" event={"ID":"8782d670-70cd-42cc-b4d7-c0c8275e457b","Type":"ContainerStarted","Data":"2788d692590d5ea686a1f6514becf743d99bbd2329e64654d1e2c869653ccfb4"}
Mar 01 09:31:01 crc kubenswrapper[4792]: I0301 09:31:01.845760 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.369367844 podStartE2EDuration="5.845737461s" podCreationTimestamp="2026-03-01 09:30:56 +0000 UTC" firstStartedPulling="2026-03-01 09:30:57.606781819 +0000 UTC m=+1386.848661016" lastFinishedPulling="2026-03-01 09:31:01.083151436 +0000 UTC m=+1390.325030633" observedRunningTime="2026-03-01 09:31:01.836860498 +0000 UTC m=+1391.078739695" watchObservedRunningTime="2026-03-01 09:31:01.845737461 +0000 UTC m=+1391.087616668"
Mar 01 09:31:01 crc kubenswrapper[4792]: I0301 09:31:01.869226 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-6q8nq" podStartSLOduration=2.869203794 podStartE2EDuration="2.869203794s" podCreationTimestamp="2026-03-01 09:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:31:01.861507019 +0000 UTC m=+1391.103386256" watchObservedRunningTime="2026-03-01 09:31:01.869203794 +0000 UTC m=+1391.111083001"
Mar 01 09:31:02 crc kubenswrapper[4792]: I0301 09:31:02.263939 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c74598c69-2pgch"
Mar 01 09:31:02 crc kubenswrapper[4792]: I0301 09:31:02.319987 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw"]
Mar 01 09:31:02 crc kubenswrapper[4792]: I0301 09:31:02.320282 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerName="dnsmasq-dns" containerID="cri-o://36ef0d467aa78ea6675158c30915ca7e3bff42a879c627274dd90d0498f741fd" gracePeriod=10
Mar 01 09:31:02 crc kubenswrapper[4792]: I0301 09:31:02.848797 4792 generic.go:334] "Generic (PLEG): container finished" podID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerID="36ef0d467aa78ea6675158c30915ca7e3bff42a879c627274dd90d0498f741fd" exitCode=0
Mar 01 09:31:02 crc kubenswrapper[4792]: I0301 09:31:02.849386 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" event={"ID":"dae0c901-5f9c-4248-96dd-08acb2b5d278","Type":"ContainerDied","Data":"36ef0d467aa78ea6675158c30915ca7e3bff42a879c627274dd90d0498f741fd"}
Mar 01 09:31:02 crc kubenswrapper[4792]: I0301 09:31:02.971979 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw"
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.028278 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-dns-svc\") pod \"dae0c901-5f9c-4248-96dd-08acb2b5d278\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") "
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.028360 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-config\") pod \"dae0c901-5f9c-4248-96dd-08acb2b5d278\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") "
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.028381 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-nb\") pod \"dae0c901-5f9c-4248-96dd-08acb2b5d278\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") "
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.028417 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhfgl\" (UniqueName: \"kubernetes.io/projected/dae0c901-5f9c-4248-96dd-08acb2b5d278-kube-api-access-rhfgl\") pod \"dae0c901-5f9c-4248-96dd-08acb2b5d278\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") "
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.028492 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-sb\") pod \"dae0c901-5f9c-4248-96dd-08acb2b5d278\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") "
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.078956 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dae0c901-5f9c-4248-96dd-08acb2b5d278-kube-api-access-rhfgl" (OuterVolumeSpecName: "kube-api-access-rhfgl") pod "dae0c901-5f9c-4248-96dd-08acb2b5d278" (UID: "dae0c901-5f9c-4248-96dd-08acb2b5d278"). InnerVolumeSpecName "kube-api-access-rhfgl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.126063 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-config" (OuterVolumeSpecName: "config") pod "dae0c901-5f9c-4248-96dd-08acb2b5d278" (UID: "dae0c901-5f9c-4248-96dd-08acb2b5d278"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.131604 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-config\") on node \"crc\" DevicePath \"\""
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.131787 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhfgl\" (UniqueName: \"kubernetes.io/projected/dae0c901-5f9c-4248-96dd-08acb2b5d278-kube-api-access-rhfgl\") on node \"crc\" DevicePath \"\""
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.147809 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dae0c901-5f9c-4248-96dd-08acb2b5d278" (UID: "dae0c901-5f9c-4248-96dd-08acb2b5d278"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.154407 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dae0c901-5f9c-4248-96dd-08acb2b5d278" (UID: "dae0c901-5f9c-4248-96dd-08acb2b5d278"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.170425 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dae0c901-5f9c-4248-96dd-08acb2b5d278" (UID: "dae0c901-5f9c-4248-96dd-08acb2b5d278"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.233006 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.233039 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.233050 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.859873 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" event={"ID":"dae0c901-5f9c-4248-96dd-08acb2b5d278","Type":"ContainerDied","Data":"c9bb9d1c4be64e9583f3d8ffde433440cf4e39ab72b4447ef9f1e98ba96850ca"}
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.859961 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw"
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.860193 4792 scope.go:117] "RemoveContainer" containerID="36ef0d467aa78ea6675158c30915ca7e3bff42a879c627274dd90d0498f741fd"
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.910453 4792 scope.go:117] "RemoveContainer" containerID="4f80ddb9167cc5bebd3ccdc43bf19f478b728967aa30e43898dc24927a2246f9"
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.934548 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw"]
Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.942743 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw"]
Mar 01 09:31:05 crc kubenswrapper[4792]: I0301 09:31:05.420611 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" path="/var/lib/kubelet/pods/dae0c901-5f9c-4248-96dd-08acb2b5d278/volumes"
Mar 01 09:31:06 crc kubenswrapper[4792]: I0301 09:31:06.890010 4792 generic.go:334] "Generic (PLEG): container finished" podID="8782d670-70cd-42cc-b4d7-c0c8275e457b" containerID="c8dd8174a7f29b772205511318ce1d6a20b6f7ef7820849db345c8f1e46e0166" exitCode=0
Mar 01 09:31:06 crc kubenswrapper[4792]: I0301 09:31:06.890089 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6q8nq" event={"ID":"8782d670-70cd-42cc-b4d7-c0c8275e457b","Type":"ContainerDied","Data":"c8dd8174a7f29b772205511318ce1d6a20b6f7ef7820849db345c8f1e46e0166"}
Mar 01 09:31:07 crc kubenswrapper[4792]: I0301 09:31:07.714782 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.181:5353: i/o timeout"
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.248094 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6q8nq"
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.323523 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-config-data\") pod \"8782d670-70cd-42cc-b4d7-c0c8275e457b\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") "
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.323659 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-combined-ca-bundle\") pod \"8782d670-70cd-42cc-b4d7-c0c8275e457b\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") "
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.323720 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-scripts\") pod \"8782d670-70cd-42cc-b4d7-c0c8275e457b\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") "
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.323837 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfbmw\" (UniqueName: \"kubernetes.io/projected/8782d670-70cd-42cc-b4d7-c0c8275e457b-kube-api-access-jfbmw\") pod \"8782d670-70cd-42cc-b4d7-c0c8275e457b\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") "
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.330172 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-scripts" (OuterVolumeSpecName: "scripts") pod "8782d670-70cd-42cc-b4d7-c0c8275e457b" (UID: "8782d670-70cd-42cc-b4d7-c0c8275e457b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.333338 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8782d670-70cd-42cc-b4d7-c0c8275e457b-kube-api-access-jfbmw" (OuterVolumeSpecName: "kube-api-access-jfbmw") pod "8782d670-70cd-42cc-b4d7-c0c8275e457b" (UID: "8782d670-70cd-42cc-b4d7-c0c8275e457b"). InnerVolumeSpecName "kube-api-access-jfbmw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.352870 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8782d670-70cd-42cc-b4d7-c0c8275e457b" (UID: "8782d670-70cd-42cc-b4d7-c0c8275e457b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.362780 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-config-data" (OuterVolumeSpecName: "config-data") pod "8782d670-70cd-42cc-b4d7-c0c8275e457b" (UID: "8782d670-70cd-42cc-b4d7-c0c8275e457b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.426229 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.426265 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfbmw\" (UniqueName: \"kubernetes.io/projected/8782d670-70cd-42cc-b4d7-c0c8275e457b-kube-api-access-jfbmw\") on node \"crc\" DevicePath \"\""
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.426279 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.426294 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.909679 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6q8nq" event={"ID":"8782d670-70cd-42cc-b4d7-c0c8275e457b","Type":"ContainerDied","Data":"2788d692590d5ea686a1f6514becf743d99bbd2329e64654d1e2c869653ccfb4"}
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.910145 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2788d692590d5ea686a1f6514becf743d99bbd2329e64654d1e2c869653ccfb4"
Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.910275 4792 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.100437 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.100692 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4b5702fe-b26d-43ee-b702-4ac5527947cd" containerName="nova-scheduler-scheduler" containerID="cri-o://1130ef58928c6df8139553b0ba2bbd03e66b69c09b50f9867cc8cd0a70ce6c1e" gracePeriod=30 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.113566 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.114180 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerName="nova-api-log" containerID="cri-o://b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9" gracePeriod=30 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.114208 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerName="nova-api-api" containerID="cri-o://14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608" gracePeriod=30 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.197531 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.197767 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-log" containerID="cri-o://4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed" gracePeriod=30 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.197996 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-metadata" containerID="cri-o://a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8" gracePeriod=30 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.693799 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.747797 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-public-tls-certs\") pod \"0641dd98-580b-48cf-87e8-4e0c891e18bd\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.748096 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-internal-tls-certs\") pod \"0641dd98-580b-48cf-87e8-4e0c891e18bd\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.748225 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0641dd98-580b-48cf-87e8-4e0c891e18bd-logs\") pod \"0641dd98-580b-48cf-87e8-4e0c891e18bd\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.748343 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-combined-ca-bundle\") pod \"0641dd98-580b-48cf-87e8-4e0c891e18bd\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.748438 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-dtjdm\" (UniqueName: \"kubernetes.io/projected/0641dd98-580b-48cf-87e8-4e0c891e18bd-kube-api-access-dtjdm\") pod \"0641dd98-580b-48cf-87e8-4e0c891e18bd\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.748543 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-config-data\") pod \"0641dd98-580b-48cf-87e8-4e0c891e18bd\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.759857 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0641dd98-580b-48cf-87e8-4e0c891e18bd-logs" (OuterVolumeSpecName: "logs") pod "0641dd98-580b-48cf-87e8-4e0c891e18bd" (UID: "0641dd98-580b-48cf-87e8-4e0c891e18bd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.766081 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0641dd98-580b-48cf-87e8-4e0c891e18bd-kube-api-access-dtjdm" (OuterVolumeSpecName: "kube-api-access-dtjdm") pod "0641dd98-580b-48cf-87e8-4e0c891e18bd" (UID: "0641dd98-580b-48cf-87e8-4e0c891e18bd"). InnerVolumeSpecName "kube-api-access-dtjdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.815074 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-config-data" (OuterVolumeSpecName: "config-data") pod "0641dd98-580b-48cf-87e8-4e0c891e18bd" (UID: "0641dd98-580b-48cf-87e8-4e0c891e18bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.830540 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0641dd98-580b-48cf-87e8-4e0c891e18bd" (UID: "0641dd98-580b-48cf-87e8-4e0c891e18bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.831709 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0641dd98-580b-48cf-87e8-4e0c891e18bd" (UID: "0641dd98-580b-48cf-87e8-4e0c891e18bd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.850746 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.850931 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.851004 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0641dd98-580b-48cf-87e8-4e0c891e18bd-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.851061 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.851121 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtjdm\" (UniqueName: \"kubernetes.io/projected/0641dd98-580b-48cf-87e8-4e0c891e18bd-kube-api-access-dtjdm\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.853351 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0641dd98-580b-48cf-87e8-4e0c891e18bd" (UID: "0641dd98-580b-48cf-87e8-4e0c891e18bd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.925612 4792 generic.go:334] "Generic (PLEG): container finished" podID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerID="14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608" exitCode=0 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.925660 4792 generic.go:334] "Generic (PLEG): container finished" podID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerID="b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9" exitCode=143 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.925765 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.926016 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0641dd98-580b-48cf-87e8-4e0c891e18bd","Type":"ContainerDied","Data":"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608"} Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.926129 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0641dd98-580b-48cf-87e8-4e0c891e18bd","Type":"ContainerDied","Data":"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9"} Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.926203 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0641dd98-580b-48cf-87e8-4e0c891e18bd","Type":"ContainerDied","Data":"c1554809fd8548d7b1bbefcc4dff40233ceca12e7183c20dd377bc2e84b15644"} Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.926312 4792 scope.go:117] "RemoveContainer" containerID="14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.928582 4792 generic.go:334] "Generic (PLEG): container finished" podID="4b5702fe-b26d-43ee-b702-4ac5527947cd" containerID="1130ef58928c6df8139553b0ba2bbd03e66b69c09b50f9867cc8cd0a70ce6c1e" exitCode=0 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.928643 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4b5702fe-b26d-43ee-b702-4ac5527947cd","Type":"ContainerDied","Data":"1130ef58928c6df8139553b0ba2bbd03e66b69c09b50f9867cc8cd0a70ce6c1e"} Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.940205 4792 generic.go:334] "Generic (PLEG): container finished" podID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerID="4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed" exitCode=143 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.940267 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66b40740-5f2c-4f3a-9d20-3307335829ed","Type":"ContainerDied","Data":"4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed"} Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.953068 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.962931 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.979020 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.992455 4792 scope.go:117] "RemoveContainer" containerID="b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.022273 4792 scope.go:117] "RemoveContainer" containerID="14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.022502 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 01 09:31:10 crc kubenswrapper[4792]: E0301 09:31:10.022930 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerName="dnsmasq-dns" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023014 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerName="dnsmasq-dns" Mar 01 09:31:10 crc kubenswrapper[4792]: E0301 09:31:10.023074 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerName="nova-api-log" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023139 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerName="nova-api-log" Mar 01 09:31:10 crc kubenswrapper[4792]: E0301 09:31:10.023195 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8782d670-70cd-42cc-b4d7-c0c8275e457b" containerName="nova-manage" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023244 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8782d670-70cd-42cc-b4d7-c0c8275e457b" containerName="nova-manage" Mar 01 09:31:10 crc kubenswrapper[4792]: E0301 09:31:10.023314 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerName="nova-api-api" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023365 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerName="nova-api-api" Mar 01 09:31:10 crc kubenswrapper[4792]: E0301 09:31:10.023425 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerName="init" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023477 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerName="init" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023686 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8782d670-70cd-42cc-b4d7-c0c8275e457b" containerName="nova-manage" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023768 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerName="dnsmasq-dns" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023830 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerName="nova-api-api" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023884 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" 
containerName="nova-api-log" Mar 01 09:31:10 crc kubenswrapper[4792]: E0301 09:31:10.024249 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608\": container with ID starting with 14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608 not found: ID does not exist" containerID="14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.024280 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608"} err="failed to get container status \"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608\": rpc error: code = NotFound desc = could not find container \"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608\": container with ID starting with 14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608 not found: ID does not exist" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.024301 4792 scope.go:117] "RemoveContainer" containerID="b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9" Mar 01 09:31:10 crc kubenswrapper[4792]: E0301 09:31:10.024519 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9\": container with ID starting with b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9 not found: ID does not exist" containerID="b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.024535 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9"} err="failed to get container status 
\"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9\": rpc error: code = NotFound desc = could not find container \"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9\": container with ID starting with b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9 not found: ID does not exist" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.024547 4792 scope.go:117] "RemoveContainer" containerID="14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.024707 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608"} err="failed to get container status \"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608\": rpc error: code = NotFound desc = could not find container \"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608\": container with ID starting with 14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608 not found: ID does not exist" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.024721 4792 scope.go:117] "RemoveContainer" containerID="b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.025257 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9"} err="failed to get container status \"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9\": rpc error: code = NotFound desc = could not find container \"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9\": container with ID starting with b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9 not found: ID does not exist" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.029303 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.032919 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.034799 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.034799 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.035439 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.055946 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.055996 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.056019 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c6de822-b7f5-4530-bb5b-ca879ff899fc-logs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.056132 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-public-tls-certs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.056179 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-config-data\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.056199 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpt2c\" (UniqueName: \"kubernetes.io/projected/9c6de822-b7f5-4530-bb5b-ca879ff899fc-kube-api-access-dpt2c\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.158480 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-public-tls-certs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.158734 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-config-data\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.158760 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpt2c\" (UniqueName: \"kubernetes.io/projected/9c6de822-b7f5-4530-bb5b-ca879ff899fc-kube-api-access-dpt2c\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 
09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.158823 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.158862 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.158884 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c6de822-b7f5-4530-bb5b-ca879ff899fc-logs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.159376 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c6de822-b7f5-4530-bb5b-ca879ff899fc-logs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.167931 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-config-data\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.169802 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.170201 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.174095 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.181446 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpt2c\" (UniqueName: \"kubernetes.io/projected/9c6de822-b7f5-4530-bb5b-ca879ff899fc-kube-api-access-dpt2c\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.362524 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.415677 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.569945 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4tl5\" (UniqueName: \"kubernetes.io/projected/4b5702fe-b26d-43ee-b702-4ac5527947cd-kube-api-access-k4tl5\") pod \"4b5702fe-b26d-43ee-b702-4ac5527947cd\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.570024 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-config-data\") pod \"4b5702fe-b26d-43ee-b702-4ac5527947cd\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.570050 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-combined-ca-bundle\") pod \"4b5702fe-b26d-43ee-b702-4ac5527947cd\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.581103 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b5702fe-b26d-43ee-b702-4ac5527947cd-kube-api-access-k4tl5" (OuterVolumeSpecName: "kube-api-access-k4tl5") pod "4b5702fe-b26d-43ee-b702-4ac5527947cd" (UID: "4b5702fe-b26d-43ee-b702-4ac5527947cd"). InnerVolumeSpecName "kube-api-access-k4tl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.608929 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-config-data" (OuterVolumeSpecName: "config-data") pod "4b5702fe-b26d-43ee-b702-4ac5527947cd" (UID: "4b5702fe-b26d-43ee-b702-4ac5527947cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.610121 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b5702fe-b26d-43ee-b702-4ac5527947cd" (UID: "4b5702fe-b26d-43ee-b702-4ac5527947cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.675160 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4tl5\" (UniqueName: \"kubernetes.io/projected/4b5702fe-b26d-43ee-b702-4ac5527947cd-kube-api-access-k4tl5\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.675403 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.675414 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.957182 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4b5702fe-b26d-43ee-b702-4ac5527947cd","Type":"ContainerDied","Data":"100c00aaeb772803e831be7330ba512b669313e4b7b45f0161f7576070d6c332"} Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.957237 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.957239 4792 scope.go:117] "RemoveContainer" containerID="1130ef58928c6df8139553b0ba2bbd03e66b69c09b50f9867cc8cd0a70ce6c1e" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.991518 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.998992 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.016170 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:31:11 crc kubenswrapper[4792]: E0301 09:31:11.016553 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b5702fe-b26d-43ee-b702-4ac5527947cd" containerName="nova-scheduler-scheduler" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.016566 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b5702fe-b26d-43ee-b702-4ac5527947cd" containerName="nova-scheduler-scheduler" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.016754 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b5702fe-b26d-43ee-b702-4ac5527947cd" containerName="nova-scheduler-scheduler" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.017332 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.029080 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.048708 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.084371 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a38c1a1-88bc-4bce-aea4-13e676aab111-config-data\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.084516 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5dn4\" (UniqueName: \"kubernetes.io/projected/3a38c1a1-88bc-4bce-aea4-13e676aab111-kube-api-access-b5dn4\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.084556 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a38c1a1-88bc-4bce-aea4-13e676aab111-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.102717 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.186067 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a38c1a1-88bc-4bce-aea4-13e676aab111-config-data\") pod \"nova-scheduler-0\" (UID: 
\"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.186179 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5dn4\" (UniqueName: \"kubernetes.io/projected/3a38c1a1-88bc-4bce-aea4-13e676aab111-kube-api-access-b5dn4\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.186208 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a38c1a1-88bc-4bce-aea4-13e676aab111-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.190388 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a38c1a1-88bc-4bce-aea4-13e676aab111-config-data\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.190707 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a38c1a1-88bc-4bce-aea4-13e676aab111-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.202054 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5dn4\" (UniqueName: \"kubernetes.io/projected/3a38c1a1-88bc-4bce-aea4-13e676aab111-kube-api-access-b5dn4\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.333809 4792 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.436583 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" path="/var/lib/kubelet/pods/0641dd98-580b-48cf-87e8-4e0c891e18bd/volumes" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.437724 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b5702fe-b26d-43ee-b702-4ac5527947cd" path="/var/lib/kubelet/pods/4b5702fe-b26d-43ee-b702-4ac5527947cd/volumes" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.794948 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.968939 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3a38c1a1-88bc-4bce-aea4-13e676aab111","Type":"ContainerStarted","Data":"f7bebc52f2b39ad43604457deeb01e7bc04c34789e04adf6c5dbee2cda5995b7"} Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.970019 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3a38c1a1-88bc-4bce-aea4-13e676aab111","Type":"ContainerStarted","Data":"4c3ff6612499c7c31496ed0b4566e6edddcc71e37f473a254a4a649a94b6bdb0"} Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.972421 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c6de822-b7f5-4530-bb5b-ca879ff899fc","Type":"ContainerStarted","Data":"9b6cf9899e3d0346caebae9b1112afc79f6f3651efb183116d398b187fb43516"} Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.972460 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c6de822-b7f5-4530-bb5b-ca879ff899fc","Type":"ContainerStarted","Data":"7491a57576f3b0a70db6735da990c539632ed6193f106bb5da1431b74de073b5"} Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.972471 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c6de822-b7f5-4530-bb5b-ca879ff899fc","Type":"ContainerStarted","Data":"ba581e26517112c0e6879caacc25eab02457091162faddb066689dd4c874ddef"} Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.983098 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.9830791140000001 podStartE2EDuration="1.983079114s" podCreationTimestamp="2026-03-01 09:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:31:11.981450665 +0000 UTC m=+1401.223329872" watchObservedRunningTime="2026-03-01 09:31:11.983079114 +0000 UTC m=+1401.224958311" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.004477 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.004463206 podStartE2EDuration="3.004463206s" podCreationTimestamp="2026-03-01 09:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:31:12.000710076 +0000 UTC m=+1401.242589283" watchObservedRunningTime="2026-03-01 09:31:12.004463206 +0000 UTC m=+1401.246342403" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.337470 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": read tcp 10.217.0.2:55656->10.217.0.186:8775: read: connection reset by peer" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.338169 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-log" probeResult="failure" output="Get 
\"https://10.217.0.186:8775/\": read tcp 10.217.0.2:55670->10.217.0.186:8775: read: connection reset by peer" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.736725 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.815549 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-nova-metadata-tls-certs\") pod \"66b40740-5f2c-4f3a-9d20-3307335829ed\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.815636 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data\") pod \"66b40740-5f2c-4f3a-9d20-3307335829ed\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.815721 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66b40740-5f2c-4f3a-9d20-3307335829ed-logs\") pod \"66b40740-5f2c-4f3a-9d20-3307335829ed\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.815745 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqzd2\" (UniqueName: \"kubernetes.io/projected/66b40740-5f2c-4f3a-9d20-3307335829ed-kube-api-access-lqzd2\") pod \"66b40740-5f2c-4f3a-9d20-3307335829ed\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.815835 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-combined-ca-bundle\") pod 
\"66b40740-5f2c-4f3a-9d20-3307335829ed\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.816348 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b40740-5f2c-4f3a-9d20-3307335829ed-logs" (OuterVolumeSpecName: "logs") pod "66b40740-5f2c-4f3a-9d20-3307335829ed" (UID: "66b40740-5f2c-4f3a-9d20-3307335829ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.837816 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b40740-5f2c-4f3a-9d20-3307335829ed-kube-api-access-lqzd2" (OuterVolumeSpecName: "kube-api-access-lqzd2") pod "66b40740-5f2c-4f3a-9d20-3307335829ed" (UID: "66b40740-5f2c-4f3a-9d20-3307335829ed"). InnerVolumeSpecName "kube-api-access-lqzd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.867004 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66b40740-5f2c-4f3a-9d20-3307335829ed" (UID: "66b40740-5f2c-4f3a-9d20-3307335829ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:12 crc kubenswrapper[4792]: E0301 09:31:12.882280 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data podName:66b40740-5f2c-4f3a-9d20-3307335829ed nodeName:}" failed. No retries permitted until 2026-03-01 09:31:13.382255214 +0000 UTC m=+1402.624134411 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data") pod "66b40740-5f2c-4f3a-9d20-3307335829ed" (UID: "66b40740-5f2c-4f3a-9d20-3307335829ed") : error deleting /var/lib/kubelet/pods/66b40740-5f2c-4f3a-9d20-3307335829ed/volume-subpaths: remove /var/lib/kubelet/pods/66b40740-5f2c-4f3a-9d20-3307335829ed/volume-subpaths: no such file or directory Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.894185 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "66b40740-5f2c-4f3a-9d20-3307335829ed" (UID: "66b40740-5f2c-4f3a-9d20-3307335829ed"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.917844 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66b40740-5f2c-4f3a-9d20-3307335829ed-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.918183 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqzd2\" (UniqueName: \"kubernetes.io/projected/66b40740-5f2c-4f3a-9d20-3307335829ed-kube-api-access-lqzd2\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.918288 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.918371 4792 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 
09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.983557 4792 generic.go:334] "Generic (PLEG): container finished" podID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerID="a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8" exitCode=0 Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.983606 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.983633 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66b40740-5f2c-4f3a-9d20-3307335829ed","Type":"ContainerDied","Data":"a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8"} Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.984378 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66b40740-5f2c-4f3a-9d20-3307335829ed","Type":"ContainerDied","Data":"243858d3fba357536368a417ecf1a917b39cc53d20c017fb5d7083ee13365f6d"} Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.984407 4792 scope.go:117] "RemoveContainer" containerID="a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.009137 4792 scope.go:117] "RemoveContainer" containerID="4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.040380 4792 scope.go:117] "RemoveContainer" containerID="a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8" Mar 01 09:31:13 crc kubenswrapper[4792]: E0301 09:31:13.040964 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8\": container with ID starting with a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8 not found: ID does not exist" 
containerID="a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.041072 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8"} err="failed to get container status \"a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8\": rpc error: code = NotFound desc = could not find container \"a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8\": container with ID starting with a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8 not found: ID does not exist" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.041178 4792 scope.go:117] "RemoveContainer" containerID="4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed" Mar 01 09:31:13 crc kubenswrapper[4792]: E0301 09:31:13.041533 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed\": container with ID starting with 4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed not found: ID does not exist" containerID="4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.041580 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed"} err="failed to get container status \"4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed\": rpc error: code = NotFound desc = could not find container \"4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed\": container with ID starting with 4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed not found: ID does not exist" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.426466 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data\") pod \"66b40740-5f2c-4f3a-9d20-3307335829ed\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.441320 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data" (OuterVolumeSpecName: "config-data") pod "66b40740-5f2c-4f3a-9d20-3307335829ed" (UID: "66b40740-5f2c-4f3a-9d20-3307335829ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.530157 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.625546 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.635649 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.653433 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:31:13 crc kubenswrapper[4792]: E0301 09:31:13.654440 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-log" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.654464 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-log" Mar 01 09:31:13 crc kubenswrapper[4792]: E0301 09:31:13.654517 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" 
containerName="nova-metadata-metadata" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.654525 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-metadata" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.654802 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-log" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.654843 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-metadata" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.656534 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.662944 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.663753 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.681752 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.836060 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.836410 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-config-data\") pod \"nova-metadata-0\" 
(UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.836641 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbf9560f-212f-460a-9a4d-250e20b00d18-logs\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.836705 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l26tw\" (UniqueName: \"kubernetes.io/projected/cbf9560f-212f-460a-9a4d-250e20b00d18-kube-api-access-l26tw\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.836829 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.938811 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.938883 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-config-data\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: 
I0301 09:31:13.938944 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbf9560f-212f-460a-9a4d-250e20b00d18-logs\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.938967 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l26tw\" (UniqueName: \"kubernetes.io/projected/cbf9560f-212f-460a-9a4d-250e20b00d18-kube-api-access-l26tw\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.939012 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.939947 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbf9560f-212f-460a-9a4d-250e20b00d18-logs\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.944813 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-config-data\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.947814 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.969606 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.969758 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l26tw\" (UniqueName: \"kubernetes.io/projected/cbf9560f-212f-460a-9a4d-250e20b00d18-kube-api-access-l26tw\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.993236 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:31:14 crc kubenswrapper[4792]: I0301 09:31:14.581288 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:31:14 crc kubenswrapper[4792]: W0301 09:31:14.586998 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbf9560f_212f_460a_9a4d_250e20b00d18.slice/crio-6a2c86e35a00a104b3d5a7496abb4ae76c54acf5921be847bd8052b9a10ab207 WatchSource:0}: Error finding container 6a2c86e35a00a104b3d5a7496abb4ae76c54acf5921be847bd8052b9a10ab207: Status 404 returned error can't find the container with id 6a2c86e35a00a104b3d5a7496abb4ae76c54acf5921be847bd8052b9a10ab207 Mar 01 09:31:15 crc kubenswrapper[4792]: I0301 09:31:15.003423 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"cbf9560f-212f-460a-9a4d-250e20b00d18","Type":"ContainerStarted","Data":"9217f2a680050cc383512f0d64b46639ceeeab526b10ba87c7de4c729b4ec997"} Mar 01 09:31:15 crc kubenswrapper[4792]: I0301 09:31:15.003766 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cbf9560f-212f-460a-9a4d-250e20b00d18","Type":"ContainerStarted","Data":"1e0244cd2b05cdedbcffd147075822a2efed61e61904b9966abbaf4c48301cc9"} Mar 01 09:31:15 crc kubenswrapper[4792]: I0301 09:31:15.003778 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cbf9560f-212f-460a-9a4d-250e20b00d18","Type":"ContainerStarted","Data":"6a2c86e35a00a104b3d5a7496abb4ae76c54acf5921be847bd8052b9a10ab207"} Mar 01 09:31:15 crc kubenswrapper[4792]: I0301 09:31:15.021077 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.021059748 podStartE2EDuration="2.021059748s" podCreationTimestamp="2026-03-01 09:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:31:15.020670509 +0000 UTC m=+1404.262549706" watchObservedRunningTime="2026-03-01 09:31:15.021059748 +0000 UTC m=+1404.262938945" Mar 01 09:31:15 crc kubenswrapper[4792]: I0301 09:31:15.420198 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" path="/var/lib/kubelet/pods/66b40740-5f2c-4f3a-9d20-3307335829ed/volumes" Mar 01 09:31:16 crc kubenswrapper[4792]: I0301 09:31:16.334221 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 01 09:31:18 crc kubenswrapper[4792]: I0301 09:31:18.993628 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 01 09:31:18 crc kubenswrapper[4792]: I0301 09:31:18.994996 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 01 09:31:20 crc kubenswrapper[4792]: I0301 09:31:20.362896 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 01 09:31:20 crc kubenswrapper[4792]: I0301 09:31:20.364456 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 01 09:31:21 crc kubenswrapper[4792]: I0301 09:31:21.335209 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 01 09:31:21 crc kubenswrapper[4792]: I0301 09:31:21.357792 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 01 09:31:21 crc kubenswrapper[4792]: I0301 09:31:21.373136 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9c6de822-b7f5-4530-bb5b-ca879ff899fc" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.196:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:31:21 crc kubenswrapper[4792]: I0301 09:31:21.373163 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9c6de822-b7f5-4530-bb5b-ca879ff899fc" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.196:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:31:22 crc kubenswrapper[4792]: I0301 09:31:22.140532 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 01 09:31:23 crc kubenswrapper[4792]: I0301 09:31:23.993375 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 01 09:31:23 crc kubenswrapper[4792]: I0301 09:31:23.993689 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 01 09:31:25 crc 
kubenswrapper[4792]: I0301 09:31:25.005081 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cbf9560f-212f-460a-9a4d-250e20b00d18" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:31:25 crc kubenswrapper[4792]: I0301 09:31:25.005082 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cbf9560f-212f-460a-9a4d-250e20b00d18" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:31:27 crc kubenswrapper[4792]: I0301 09:31:27.175613 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 01 09:31:27 crc kubenswrapper[4792]: I0301 09:31:27.265698 4792 scope.go:117] "RemoveContainer" containerID="f4b48983a710c40494648dd6a515d3975deee5c28f7b927750a63de93e040785" Mar 01 09:31:27 crc kubenswrapper[4792]: I0301 09:31:27.298133 4792 scope.go:117] "RemoveContainer" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" Mar 01 09:31:27 crc kubenswrapper[4792]: I0301 09:31:27.328970 4792 scope.go:117] "RemoveContainer" containerID="2712402bbe2a400f1f172cc0f249c7e35edf7b64593d06c6d1cfd9d81ee06f57" Mar 01 09:31:30 crc kubenswrapper[4792]: I0301 09:31:30.371141 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 01 09:31:30 crc kubenswrapper[4792]: I0301 09:31:30.372568 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 01 09:31:30 crc kubenswrapper[4792]: I0301 09:31:30.373131 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 01 09:31:30 crc kubenswrapper[4792]: I0301 09:31:30.382410 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 01 09:31:31 crc kubenswrapper[4792]: I0301 09:31:31.140748 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 01 09:31:31 crc kubenswrapper[4792]: I0301 09:31:31.147872 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 01 09:31:33 crc kubenswrapper[4792]: I0301 09:31:33.999192 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 01 09:31:34 crc kubenswrapper[4792]: I0301 09:31:34.006049 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 01 09:31:34 crc kubenswrapper[4792]: I0301 09:31:34.007925 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 01 09:31:34 crc kubenswrapper[4792]: I0301 09:31:34.170459 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 01 09:31:41 crc kubenswrapper[4792]: I0301 09:31:41.303400 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 01 09:31:42 crc kubenswrapper[4792]: I0301 09:31:42.839776 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 01 09:31:46 crc kubenswrapper[4792]: I0301 09:31:46.872476 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="rabbitmq" containerID="cri-o://e872a8250debe35b2b405169cd43bdbc962c34739bd277e35d8038f3fa166251" gracePeriod=604795 Mar 01 09:31:47 crc kubenswrapper[4792]: I0301 09:31:47.596132 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" 
containerName="rabbitmq" containerID="cri-o://763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5" gracePeriod=604796 Mar 01 09:31:52 crc kubenswrapper[4792]: I0301 09:31:52.928120 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.356275 4792 generic.go:334] "Generic (PLEG): container finished" podID="6252a079-917c-46e8-a848-10569e1e057e" containerID="e872a8250debe35b2b405169cd43bdbc962c34739bd277e35d8038f3fa166251" exitCode=0 Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.356362 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6252a079-917c-46e8-a848-10569e1e057e","Type":"ContainerDied","Data":"e872a8250debe35b2b405169cd43bdbc962c34739bd277e35d8038f3fa166251"} Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.356577 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6252a079-917c-46e8-a848-10569e1e057e","Type":"ContainerDied","Data":"dda196254ac808b630ce64a84572b51fae2b42596880acb14f053e9eec301086"} Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.356592 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dda196254ac808b630ce64a84572b51fae2b42596880acb14f053e9eec301086" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.423262 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599557 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6252a079-917c-46e8-a848-10569e1e057e-pod-info\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599725 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599769 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-plugins-conf\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599800 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6252a079-917c-46e8-a848-10569e1e057e-erlang-cookie-secret\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599823 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-confd\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599880 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-tls\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599898 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnscv\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-kube-api-access-qnscv\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599973 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-erlang-cookie\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599990 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-plugins\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.600014 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-config-data\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.600035 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-server-conf\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc 
kubenswrapper[4792]: I0301 09:31:53.601523 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.601592 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.601594 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.610811 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6252a079-917c-46e8-a848-10569e1e057e-pod-info" (OuterVolumeSpecName: "pod-info") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.610860 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.611034 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.613138 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-kube-api-access-qnscv" (OuterVolumeSpecName: "kube-api-access-qnscv") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "kube-api-access-qnscv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.632326 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6252a079-917c-46e8-a848-10569e1e057e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.690401 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-config-data" (OuterVolumeSpecName: "config-data") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703426 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703488 4792 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703502 4792 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6252a079-917c-46e8-a848-10569e1e057e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703512 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703520 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnscv\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-kube-api-access-qnscv\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703528 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703536 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703543 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703553 4792 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6252a079-917c-46e8-a848-10569e1e057e-pod-info\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.708123 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-server-conf" (OuterVolumeSpecName: "server-conf") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.743328 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.763598 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.804591 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.804743 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.804753 4792 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-server-conf\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.055163 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218158 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-confd\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218199 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218253 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-server-conf\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: 
\"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218297 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-pod-info\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218318 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-tls\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218377 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8fhc\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-kube-api-access-q8fhc\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218435 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-plugins\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218479 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-erlang-cookie-secret\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218509 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-erlang-cookie\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218580 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-config-data\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218610 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-plugins-conf\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.219467 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.219733 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.221839 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.224878 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-kube-api-access-q8fhc" (OuterVolumeSpecName: "kube-api-access-q8fhc") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "kube-api-access-q8fhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.224997 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.226133 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.235093 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-pod-info" (OuterVolumeSpecName: "pod-info") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.246065 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.277218 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-config-data" (OuterVolumeSpecName: "config-data") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.320986 4792 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-pod-info\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321015 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321025 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8fhc\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-kube-api-access-q8fhc\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321191 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321205 4792 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321213 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321222 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: 
I0301 09:31:54.321229 4792 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321260 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321980 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-server-conf" (OuterVolumeSpecName: "server-conf") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.339152 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.362525 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.372488 4792 generic.go:334] "Generic (PLEG): container finished" podID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerID="763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5" exitCode=0 Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.372578 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.372576 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24","Type":"ContainerDied","Data":"763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5"}
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.372637 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24","Type":"ContainerDied","Data":"3e8de91b3c58261b32cbdb52401a16acdc8aa762850b0b7a587dfa85e98e1d6e"}
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.372657 4792 scope.go:117] "RemoveContainer" containerID="763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.373526 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.422547 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.422575 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.422585 4792 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-server-conf\") on node \"crc\" DevicePath \"\""
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.431865 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.442127 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.450855 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.474427 4792 scope.go:117] "RemoveContainer" containerID="6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.509998 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.531028 4792 scope.go:117] "RemoveContainer" containerID="763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5"
Mar 01 09:31:54 crc kubenswrapper[4792]: E0301 09:31:54.531483 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5\": container with ID starting with 763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5 not found: ID does not exist" containerID="763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.531592 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5"} err="failed to get container status \"763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5\": rpc error: code = NotFound desc = could not find container \"763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5\": container with ID starting with 763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5 not found: ID does not exist"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.531694 4792 scope.go:117] "RemoveContainer" containerID="6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b"
Mar 01 09:31:54 crc kubenswrapper[4792]: E0301 09:31:54.532193 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b\": container with ID starting with 6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b not found: ID does not exist" containerID="6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.532323 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b"} err="failed to get container status \"6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b\": rpc error: code = NotFound desc = could not find container \"6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b\": container with ID starting with 6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b not found: ID does not exist"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.547605 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 01 09:31:54 crc kubenswrapper[4792]: E0301 09:31:54.548012 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerName="rabbitmq"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.548031 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerName="rabbitmq"
Mar 01 09:31:54 crc kubenswrapper[4792]: E0301 09:31:54.548057 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="rabbitmq"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.548063 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="rabbitmq"
Mar 01 09:31:54 crc kubenswrapper[4792]: E0301 09:31:54.548074 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="setup-container"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.548080 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="setup-container"
Mar 01 09:31:54 crc kubenswrapper[4792]: E0301 09:31:54.548089 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerName="setup-container"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.548095 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerName="setup-container"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.548253 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerName="rabbitmq"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.548267 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="rabbitmq"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.549232 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.552478 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.552799 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.553239 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.553786 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.553815 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.553814 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5zwb6"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.553872 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.571603 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.575614 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.580705 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.580975 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.581155 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.581166 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.581003 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.581073 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-584kl"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.582244 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.588334 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.600412 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.625984 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626043 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626066 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-config-data\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626089 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626116 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626140 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626160 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626183 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626195 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63658b27-63d9-4a0f-afca-3a3c245b9b9d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626213 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63658b27-63d9-4a0f-afca-3a3c245b9b9d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626226 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nnp2\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-kube-api-access-5nnp2\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.727982 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728290 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728399 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728494 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728554 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728578 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-config-data\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728601 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728621 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728648 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhq6d\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-kube-api-access-vhq6d\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728686 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728716 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728740 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728774 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728799 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728856 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728882 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63658b27-63d9-4a0f-afca-3a3c245b9b9d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728899 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728935 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63658b27-63d9-4a0f-afca-3a3c245b9b9d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728951 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nnp2\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-kube-api-access-5nnp2\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.729038 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.729055 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.729307 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.729643 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.729762 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-config-data\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.730049 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.730243 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.730522 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.733224 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63658b27-63d9-4a0f-afca-3a3c245b9b9d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.735695 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63658b27-63d9-4a0f-afca-3a3c245b9b9d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.736193 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.736745 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.749104 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nnp2\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-kube-api-access-5nnp2\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.763796 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830169 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830237 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830281 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830317 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830376 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830793 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830825 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830922 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830962 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830986 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.831088 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.831116 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhq6d\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-kube-api-access-vhq6d\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.831767 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.832103 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.832359 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.832770 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.833351 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.835490 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.835629 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.835944 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.836395 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.849276 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhq6d\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-kube-api-access-vhq6d\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.858186 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.867325 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.892819 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 01 09:31:55 crc kubenswrapper[4792]: I0301 09:31:55.353352 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 01 09:31:55 crc kubenswrapper[4792]: W0301 09:31:55.361532 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63658b27_63d9_4a0f_afca_3a3c245b9b9d.slice/crio-f695ccbc44c0b0adbed26c0a4d27156ab9f6b736a46cdb18c00e990b0e4ef3c0 WatchSource:0}: Error finding container f695ccbc44c0b0adbed26c0a4d27156ab9f6b736a46cdb18c00e990b0e4ef3c0: Status 404 returned error can't find the container with id f695ccbc44c0b0adbed26c0a4d27156ab9f6b736a46cdb18c00e990b0e4ef3c0
Mar 01 09:31:55 crc kubenswrapper[4792]: I0301 09:31:55.381671 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63658b27-63d9-4a0f-afca-3a3c245b9b9d","Type":"ContainerStarted","Data":"f695ccbc44c0b0adbed26c0a4d27156ab9f6b736a46cdb18c00e990b0e4ef3c0"}
Mar 01 09:31:55 crc kubenswrapper[4792]: I0301 09:31:55.423375 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" path="/var/lib/kubelet/pods/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24/volumes"
Mar 01 09:31:55 crc kubenswrapper[4792]: I0301 09:31:55.424156 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6252a079-917c-46e8-a848-10569e1e057e" path="/var/lib/kubelet/pods/6252a079-917c-46e8-a848-10569e1e057e/volumes"
Mar 01 09:31:55 crc kubenswrapper[4792]: I0301 09:31:55.454274 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 01 09:31:55 crc kubenswrapper[4792]: W0301 09:31:55.470600 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0e1dd7a_6a53_446d_bf90_5813f7a3fda0.slice/crio-a99e25d1f8fff02f96e102db5bcabc13c7b4a765ad26ce58ed52f48bec020deb WatchSource:0}: Error finding container a99e25d1f8fff02f96e102db5bcabc13c7b4a765ad26ce58ed52f48bec020deb: Status 404 returned error can't find the container with id a99e25d1f8fff02f96e102db5bcabc13c7b4a765ad26ce58ed52f48bec020deb
Mar 01 09:31:56 crc kubenswrapper[4792]: I0301 09:31:56.392927 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0","Type":"ContainerStarted","Data":"a99e25d1f8fff02f96e102db5bcabc13c7b4a765ad26ce58ed52f48bec020deb"}
Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.271601 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-lcb28"]
Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.274026 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59c44489bc-lcb28"
Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.279980 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.287052 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-lcb28"]
Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.382535 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-openstack-edpm-ipam\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28"
Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.382627 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\"
(UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-sb\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.382722 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-config\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.382756 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9kw4\" (UniqueName: \"kubernetes.io/projected/241ce805-8049-491d-bdf9-eeadf9ea4080-kube-api-access-p9kw4\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.382799 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-dns-svc\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.382844 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-nb\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.402015 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"63658b27-63d9-4a0f-afca-3a3c245b9b9d","Type":"ContainerStarted","Data":"2ab8068f1ad5a6321a051380bfb498b59d9b17d29b4726fc01b0c882c34ec764"} Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.404804 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0","Type":"ContainerStarted","Data":"a31d63c5aa5d050434ce6a4f0e9ec1378f77889b76bca1387d32b037ac0b6ede"} Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.484672 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-sb\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.484808 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-config\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.484840 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9kw4\" (UniqueName: \"kubernetes.io/projected/241ce805-8049-491d-bdf9-eeadf9ea4080-kube-api-access-p9kw4\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.484979 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-dns-svc\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " 
pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.485085 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-nb\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.485169 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-openstack-edpm-ipam\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.486826 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-config\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.487029 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-sb\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.487745 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-dns-svc\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 
09:31:57.487781 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-nb\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.488239 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-openstack-edpm-ipam\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.508800 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9kw4\" (UniqueName: \"kubernetes.io/projected/241ce805-8049-491d-bdf9-eeadf9ea4080-kube-api-access-p9kw4\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.597190 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.933244 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-lcb28"] Mar 01 09:31:58 crc kubenswrapper[4792]: I0301 09:31:58.232543 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: i/o timeout" Mar 01 09:31:58 crc kubenswrapper[4792]: I0301 09:31:58.413351 4792 generic.go:334] "Generic (PLEG): container finished" podID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerID="05e67a0819aa5ed5e27ec53a14d250ee562957cc561df53697e3217eab454f4c" exitCode=0 Mar 01 09:31:58 crc kubenswrapper[4792]: I0301 09:31:58.413385 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" event={"ID":"241ce805-8049-491d-bdf9-eeadf9ea4080","Type":"ContainerDied","Data":"05e67a0819aa5ed5e27ec53a14d250ee562957cc561df53697e3217eab454f4c"} Mar 01 09:31:58 crc kubenswrapper[4792]: I0301 09:31:58.413424 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" event={"ID":"241ce805-8049-491d-bdf9-eeadf9ea4080","Type":"ContainerStarted","Data":"31553e15cd3569da29e5d2b0a1053e7e96e46dcf7276c14267ec5547dc01b54a"} Mar 01 09:31:59 crc kubenswrapper[4792]: I0301 09:31:59.422830 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" event={"ID":"241ce805-8049-491d-bdf9-eeadf9ea4080","Type":"ContainerStarted","Data":"6d67b5704dbd88e3a06443bdd2c44be45361687723ea422a2d3f017f9864c781"} Mar 01 09:31:59 crc kubenswrapper[4792]: I0301 09:31:59.423133 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:59 crc kubenswrapper[4792]: I0301 09:31:59.451873 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" podStartSLOduration=2.451854309 podStartE2EDuration="2.451854309s" podCreationTimestamp="2026-03-01 09:31:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:31:59.443009503 +0000 UTC m=+1448.684888700" watchObservedRunningTime="2026-03-01 09:31:59.451854309 +0000 UTC m=+1448.693733506" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.132417 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539292-vszff"] Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.133580 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539292-vszff" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.137104 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.138477 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.138756 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.146858 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539292-vszff"] Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.229362 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwrnp\" (UniqueName: \"kubernetes.io/projected/13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc-kube-api-access-kwrnp\") pod \"auto-csr-approver-29539292-vszff\" (UID: \"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc\") " 
pod="openshift-infra/auto-csr-approver-29539292-vszff" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.331137 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwrnp\" (UniqueName: \"kubernetes.io/projected/13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc-kube-api-access-kwrnp\") pod \"auto-csr-approver-29539292-vszff\" (UID: \"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc\") " pod="openshift-infra/auto-csr-approver-29539292-vszff" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.349636 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwrnp\" (UniqueName: \"kubernetes.io/projected/13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc-kube-api-access-kwrnp\") pod \"auto-csr-approver-29539292-vszff\" (UID: \"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc\") " pod="openshift-infra/auto-csr-approver-29539292-vszff" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.502335 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539292-vszff" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.920583 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539292-vszff"] Mar 01 09:32:01 crc kubenswrapper[4792]: I0301 09:32:01.444920 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539292-vszff" event={"ID":"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc","Type":"ContainerStarted","Data":"37a36460b8475148e96ac7c4dad0fb77508fa47b713a2dfd5024df5d2510ac5c"} Mar 01 09:32:03 crc kubenswrapper[4792]: I0301 09:32:03.464607 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539292-vszff" event={"ID":"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc","Type":"ContainerStarted","Data":"8ffaa11ad79d37459635175d7fbda620c8204659760d7268eb92ed800bd1a03d"} Mar 01 09:32:03 crc kubenswrapper[4792]: I0301 09:32:03.484750 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539292-vszff" podStartSLOduration=1.6695277339999999 podStartE2EDuration="3.484734774s" podCreationTimestamp="2026-03-01 09:32:00 +0000 UTC" firstStartedPulling="2026-03-01 09:32:00.923057737 +0000 UTC m=+1450.164936944" lastFinishedPulling="2026-03-01 09:32:02.738264777 +0000 UTC m=+1451.980143984" observedRunningTime="2026-03-01 09:32:03.479030761 +0000 UTC m=+1452.720909958" watchObservedRunningTime="2026-03-01 09:32:03.484734774 +0000 UTC m=+1452.726613981" Mar 01 09:32:04 crc kubenswrapper[4792]: I0301 09:32:04.476166 4792 generic.go:334] "Generic (PLEG): container finished" podID="13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc" containerID="8ffaa11ad79d37459635175d7fbda620c8204659760d7268eb92ed800bd1a03d" exitCode=0 Mar 01 09:32:04 crc kubenswrapper[4792]: I0301 09:32:04.476231 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539292-vszff" event={"ID":"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc","Type":"ContainerDied","Data":"8ffaa11ad79d37459635175d7fbda620c8204659760d7268eb92ed800bd1a03d"} Mar 01 09:32:05 crc kubenswrapper[4792]: I0301 09:32:05.828039 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539292-vszff" Mar 01 09:32:05 crc kubenswrapper[4792]: I0301 09:32:05.953580 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwrnp\" (UniqueName: \"kubernetes.io/projected/13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc-kube-api-access-kwrnp\") pod \"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc\" (UID: \"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc\") " Mar 01 09:32:05 crc kubenswrapper[4792]: I0301 09:32:05.959185 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc-kube-api-access-kwrnp" (OuterVolumeSpecName: "kube-api-access-kwrnp") pod "13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc" (UID: "13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc"). InnerVolumeSpecName "kube-api-access-kwrnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:32:06 crc kubenswrapper[4792]: I0301 09:32:06.055796 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwrnp\" (UniqueName: \"kubernetes.io/projected/13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc-kube-api-access-kwrnp\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:06 crc kubenswrapper[4792]: I0301 09:32:06.494738 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539292-vszff" event={"ID":"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc","Type":"ContainerDied","Data":"37a36460b8475148e96ac7c4dad0fb77508fa47b713a2dfd5024df5d2510ac5c"} Mar 01 09:32:06 crc kubenswrapper[4792]: I0301 09:32:06.494782 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37a36460b8475148e96ac7c4dad0fb77508fa47b713a2dfd5024df5d2510ac5c" Mar 01 09:32:06 crc kubenswrapper[4792]: I0301 09:32:06.494840 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539292-vszff" Mar 01 09:32:06 crc kubenswrapper[4792]: I0301 09:32:06.555180 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539286-l47hq"] Mar 01 09:32:06 crc kubenswrapper[4792]: I0301 09:32:06.563824 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539286-l47hq"] Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.418757 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eeb77af-03ae-4e32-80a6-3c16ed5ef64e" path="/var/lib/kubelet/pods/2eeb77af-03ae-4e32-80a6-3c16ed5ef64e/volumes" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.599125 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.684014 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-2pgch"] Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.684217 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" podUID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerName="dnsmasq-dns" containerID="cri-o://a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954" gracePeriod=10 Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.853987 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb7494899-9x44w"] Mar 01 09:32:07 crc kubenswrapper[4792]: E0301 09:32:07.854349 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc" containerName="oc" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.854363 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc" containerName="oc" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.854542 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc" containerName="oc" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.864362 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.871035 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb7494899-9x44w"] Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.995582 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.995674 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-nb\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.995701 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjnt9\" (UniqueName: \"kubernetes.io/projected/a3a408d8-0510-4867-8517-e609d614a5d2-kube-api-access-cjnt9\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.995735 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-dns-svc\") pod 
\"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.995795 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-config\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.995829 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-sb\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.097324 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.097416 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-nb\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.097437 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjnt9\" (UniqueName: \"kubernetes.io/projected/a3a408d8-0510-4867-8517-e609d614a5d2-kube-api-access-cjnt9\") pod 
\"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.097458 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-dns-svc\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.097478 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-config\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.097501 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-sb\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.098396 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-sb\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.098420 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-nb\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " 
pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.098786 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-config\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.099147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.099180 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-dns-svc\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.130137 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjnt9\" (UniqueName: \"kubernetes.io/projected/a3a408d8-0510-4867-8517-e609d614a5d2-kube-api-access-cjnt9\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.195256 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.206533 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.301270 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-sb\") pod \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.301519 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drdkm\" (UniqueName: \"kubernetes.io/projected/3dc02dae-4469-4e20-aca1-c85d7e451b7f-kube-api-access-drdkm\") pod \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.301570 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-config\") pod \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.301643 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-nb\") pod \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.301674 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-dns-svc\") pod \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.322939 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3dc02dae-4469-4e20-aca1-c85d7e451b7f-kube-api-access-drdkm" (OuterVolumeSpecName: "kube-api-access-drdkm") pod "3dc02dae-4469-4e20-aca1-c85d7e451b7f" (UID: "3dc02dae-4469-4e20-aca1-c85d7e451b7f"). InnerVolumeSpecName "kube-api-access-drdkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.357300 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3dc02dae-4469-4e20-aca1-c85d7e451b7f" (UID: "3dc02dae-4469-4e20-aca1-c85d7e451b7f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.358850 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-config" (OuterVolumeSpecName: "config") pod "3dc02dae-4469-4e20-aca1-c85d7e451b7f" (UID: "3dc02dae-4469-4e20-aca1-c85d7e451b7f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.361534 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3dc02dae-4469-4e20-aca1-c85d7e451b7f" (UID: "3dc02dae-4469-4e20-aca1-c85d7e451b7f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.365574 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3dc02dae-4469-4e20-aca1-c85d7e451b7f" (UID: "3dc02dae-4469-4e20-aca1-c85d7e451b7f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.404938 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.404968 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.404982 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drdkm\" (UniqueName: \"kubernetes.io/projected/3dc02dae-4469-4e20-aca1-c85d7e451b7f-kube-api-access-drdkm\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.404993 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.405006 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.512595 4792 generic.go:334] "Generic (PLEG): container finished" podID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerID="a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954" exitCode=0 Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.512637 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" event={"ID":"3dc02dae-4469-4e20-aca1-c85d7e451b7f","Type":"ContainerDied","Data":"a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954"} Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 
09:32:08.512667 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" event={"ID":"3dc02dae-4469-4e20-aca1-c85d7e451b7f","Type":"ContainerDied","Data":"6c7fd81ed9af7972987f7c71c071457fa812a6cd3376e81094962a9f69854811"} Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.512687 4792 scope.go:117] "RemoveContainer" containerID="a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.512831 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.536404 4792 scope.go:117] "RemoveContainer" containerID="e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.557385 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-2pgch"] Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.564105 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-2pgch"] Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.575702 4792 scope.go:117] "RemoveContainer" containerID="a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954" Mar 01 09:32:08 crc kubenswrapper[4792]: E0301 09:32:08.576449 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954\": container with ID starting with a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954 not found: ID does not exist" containerID="a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.576475 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954"} err="failed to get container status \"a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954\": rpc error: code = NotFound desc = could not find container \"a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954\": container with ID starting with a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954 not found: ID does not exist" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.576496 4792 scope.go:117] "RemoveContainer" containerID="e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c" Mar 01 09:32:08 crc kubenswrapper[4792]: E0301 09:32:08.577478 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c\": container with ID starting with e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c not found: ID does not exist" containerID="e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.577501 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c"} err="failed to get container status \"e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c\": rpc error: code = NotFound desc = could not find container \"e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c\": container with ID starting with e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c not found: ID does not exist" Mar 01 09:32:09 crc kubenswrapper[4792]: I0301 09:32:09.266691 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb7494899-9x44w"] Mar 01 09:32:09 crc kubenswrapper[4792]: W0301 09:32:09.280033 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3a408d8_0510_4867_8517_e609d614a5d2.slice/crio-3830dbf250a4bedcc384f157cb50c2184899780d10921fec2974544052904874 WatchSource:0}: Error finding container 3830dbf250a4bedcc384f157cb50c2184899780d10921fec2974544052904874: Status 404 returned error can't find the container with id 3830dbf250a4bedcc384f157cb50c2184899780d10921fec2974544052904874 Mar 01 09:32:09 crc kubenswrapper[4792]: I0301 09:32:09.421335 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" path="/var/lib/kubelet/pods/3dc02dae-4469-4e20-aca1-c85d7e451b7f/volumes" Mar 01 09:32:09 crc kubenswrapper[4792]: I0301 09:32:09.528758 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7494899-9x44w" event={"ID":"a3a408d8-0510-4867-8517-e609d614a5d2","Type":"ContainerStarted","Data":"3830dbf250a4bedcc384f157cb50c2184899780d10921fec2974544052904874"} Mar 01 09:32:10 crc kubenswrapper[4792]: I0301 09:32:10.539055 4792 generic.go:334] "Generic (PLEG): container finished" podID="a3a408d8-0510-4867-8517-e609d614a5d2" containerID="9c10172b37a7ca756da9dc968292ad3961c9a1084ca116b725c3c50da7e6ecd8" exitCode=0 Mar 01 09:32:10 crc kubenswrapper[4792]: I0301 09:32:10.539145 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7494899-9x44w" event={"ID":"a3a408d8-0510-4867-8517-e609d614a5d2","Type":"ContainerDied","Data":"9c10172b37a7ca756da9dc968292ad3961c9a1084ca116b725c3c50da7e6ecd8"} Mar 01 09:32:11 crc kubenswrapper[4792]: I0301 09:32:11.548088 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7494899-9x44w" event={"ID":"a3a408d8-0510-4867-8517-e609d614a5d2","Type":"ContainerStarted","Data":"8b04ed1bd42aa863eef16fe08f4819a294c0cd44d80b6329225717f3d7d610c0"} Mar 01 09:32:11 crc kubenswrapper[4792]: I0301 09:32:11.548749 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:11 crc kubenswrapper[4792]: I0301 09:32:11.599112 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb7494899-9x44w" podStartSLOduration=4.59909393 podStartE2EDuration="4.59909393s" podCreationTimestamp="2026-03-01 09:32:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:32:11.571484795 +0000 UTC m=+1460.813364042" watchObservedRunningTime="2026-03-01 09:32:11.59909393 +0000 UTC m=+1460.840973127" Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.197076 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.289764 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-lcb28"] Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.290102 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" podUID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerName="dnsmasq-dns" containerID="cri-o://6d67b5704dbd88e3a06443bdd2c44be45361687723ea422a2d3f017f9864c781" gracePeriod=10 Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.622304 4792 generic.go:334] "Generic (PLEG): container finished" podID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerID="6d67b5704dbd88e3a06443bdd2c44be45361687723ea422a2d3f017f9864c781" exitCode=0 Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.622360 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" event={"ID":"241ce805-8049-491d-bdf9-eeadf9ea4080","Type":"ContainerDied","Data":"6d67b5704dbd88e3a06443bdd2c44be45361687723ea422a2d3f017f9864c781"} Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.777134 4792 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.931131 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-config\") pod \"241ce805-8049-491d-bdf9-eeadf9ea4080\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.931206 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-dns-svc\") pod \"241ce805-8049-491d-bdf9-eeadf9ea4080\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.931253 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-sb\") pod \"241ce805-8049-491d-bdf9-eeadf9ea4080\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.931346 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-nb\") pod \"241ce805-8049-491d-bdf9-eeadf9ea4080\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.931414 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-openstack-edpm-ipam\") pod \"241ce805-8049-491d-bdf9-eeadf9ea4080\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.931431 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9kw4\" 
(UniqueName: \"kubernetes.io/projected/241ce805-8049-491d-bdf9-eeadf9ea4080-kube-api-access-p9kw4\") pod \"241ce805-8049-491d-bdf9-eeadf9ea4080\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.954304 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/241ce805-8049-491d-bdf9-eeadf9ea4080-kube-api-access-p9kw4" (OuterVolumeSpecName: "kube-api-access-p9kw4") pod "241ce805-8049-491d-bdf9-eeadf9ea4080" (UID: "241ce805-8049-491d-bdf9-eeadf9ea4080"). InnerVolumeSpecName "kube-api-access-p9kw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.990298 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "241ce805-8049-491d-bdf9-eeadf9ea4080" (UID: "241ce805-8049-491d-bdf9-eeadf9ea4080"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.990631 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "241ce805-8049-491d-bdf9-eeadf9ea4080" (UID: "241ce805-8049-491d-bdf9-eeadf9ea4080"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.011405 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "241ce805-8049-491d-bdf9-eeadf9ea4080" (UID: "241ce805-8049-491d-bdf9-eeadf9ea4080"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.011412 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "241ce805-8049-491d-bdf9-eeadf9ea4080" (UID: "241ce805-8049-491d-bdf9-eeadf9ea4080"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.017415 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-config" (OuterVolumeSpecName: "config") pod "241ce805-8049-491d-bdf9-eeadf9ea4080" (UID: "241ce805-8049-491d-bdf9-eeadf9ea4080"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.033954 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.033985 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.033995 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9kw4\" (UniqueName: \"kubernetes.io/projected/241ce805-8049-491d-bdf9-eeadf9ea4080-kube-api-access-p9kw4\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.034012 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-config\") on node \"crc\" 
DevicePath \"\"" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.034024 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.034035 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.631643 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" event={"ID":"241ce805-8049-491d-bdf9-eeadf9ea4080","Type":"ContainerDied","Data":"31553e15cd3569da29e5d2b0a1053e7e96e46dcf7276c14267ec5547dc01b54a"} Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.631694 4792 scope.go:117] "RemoveContainer" containerID="6d67b5704dbd88e3a06443bdd2c44be45361687723ea422a2d3f017f9864c781" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.632674 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.657095 4792 scope.go:117] "RemoveContainer" containerID="05e67a0819aa5ed5e27ec53a14d250ee562957cc561df53697e3217eab454f4c" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.660423 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-lcb28"] Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.667586 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-lcb28"] Mar 01 09:32:21 crc kubenswrapper[4792]: I0301 09:32:21.418625 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="241ce805-8049-491d-bdf9-eeadf9ea4080" path="/var/lib/kubelet/pods/241ce805-8049-491d-bdf9-eeadf9ea4080/volumes" Mar 01 09:32:27 crc kubenswrapper[4792]: I0301 09:32:27.534783 4792 scope.go:117] "RemoveContainer" containerID="30d57fe1f686a0e7d648422ad7801f657bc274b2e9502cf906d12a5e85e207f4" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.350877 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln"] Mar 01 09:32:28 crc kubenswrapper[4792]: E0301 09:32:28.352024 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerName="init" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.352049 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerName="init" Mar 01 09:32:28 crc kubenswrapper[4792]: E0301 09:32:28.352085 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerName="dnsmasq-dns" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.352094 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerName="dnsmasq-dns" Mar 01 09:32:28 crc kubenswrapper[4792]: 
E0301 09:32:28.352111 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerName="dnsmasq-dns" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.352121 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerName="dnsmasq-dns" Mar 01 09:32:28 crc kubenswrapper[4792]: E0301 09:32:28.352141 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerName="init" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.352150 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerName="init" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.352574 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerName="dnsmasq-dns" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.352610 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerName="dnsmasq-dns" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.353670 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.358608 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.358811 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.372495 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.372786 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.387615 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln"] Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.503299 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.503349 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc 
kubenswrapper[4792]: I0301 09:32:28.503397 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.503632 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p25kk\" (UniqueName: \"kubernetes.io/projected/54e68c85-54c7-4855-b4a0-a85d2014c7b7-kube-api-access-p25kk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.605200 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.605261 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.605336 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.605379 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p25kk\" (UniqueName: \"kubernetes.io/projected/54e68c85-54c7-4855-b4a0-a85d2014c7b7-kube-api-access-p25kk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.611960 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.616484 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.619100 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.619697 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p25kk\" (UniqueName: \"kubernetes.io/projected/54e68c85-54c7-4855-b4a0-a85d2014c7b7-kube-api-access-p25kk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.677712 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:29 crc kubenswrapper[4792]: I0301 09:32:29.180987 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln"] Mar 01 09:32:29 crc kubenswrapper[4792]: I0301 09:32:29.736629 4792 generic.go:334] "Generic (PLEG): container finished" podID="63658b27-63d9-4a0f-afca-3a3c245b9b9d" containerID="2ab8068f1ad5a6321a051380bfb498b59d9b17d29b4726fc01b0c882c34ec764" exitCode=0 Mar 01 09:32:29 crc kubenswrapper[4792]: I0301 09:32:29.736740 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63658b27-63d9-4a0f-afca-3a3c245b9b9d","Type":"ContainerDied","Data":"2ab8068f1ad5a6321a051380bfb498b59d9b17d29b4726fc01b0c882c34ec764"} Mar 01 09:32:29 crc kubenswrapper[4792]: I0301 09:32:29.739631 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" event={"ID":"54e68c85-54c7-4855-b4a0-a85d2014c7b7","Type":"ContainerStarted","Data":"d567265e3ca5a368f3df31052aed27c28d8ae817e07fd00d4fefda5d841711b8"} Mar 01 09:32:29 crc kubenswrapper[4792]: I0301 09:32:29.741830 4792 generic.go:334] "Generic (PLEG): container finished" podID="e0e1dd7a-6a53-446d-bf90-5813f7a3fda0" 
containerID="a31d63c5aa5d050434ce6a4f0e9ec1378f77889b76bca1387d32b037ac0b6ede" exitCode=0 Mar 01 09:32:29 crc kubenswrapper[4792]: I0301 09:32:29.742004 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0","Type":"ContainerDied","Data":"a31d63c5aa5d050434ce6a4f0e9ec1378f77889b76bca1387d32b037ac0b6ede"} Mar 01 09:32:30 crc kubenswrapper[4792]: I0301 09:32:30.755749 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63658b27-63d9-4a0f-afca-3a3c245b9b9d","Type":"ContainerStarted","Data":"d696bdd63d2cda6818c556f518a2d022a5026e40dcd06f6b79dce7c77e643e51"} Mar 01 09:32:30 crc kubenswrapper[4792]: I0301 09:32:30.756606 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 01 09:32:30 crc kubenswrapper[4792]: I0301 09:32:30.760967 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0","Type":"ContainerStarted","Data":"bd13573e587f769a572a7e418917f49e78ffad74eac437c621b579cbfab272e8"} Mar 01 09:32:30 crc kubenswrapper[4792]: I0301 09:32:30.761871 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:32:30 crc kubenswrapper[4792]: I0301 09:32:30.784447 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.784429925 podStartE2EDuration="36.784429925s" podCreationTimestamp="2026-03-01 09:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:32:30.781504167 +0000 UTC m=+1480.023383364" watchObservedRunningTime="2026-03-01 09:32:30.784429925 +0000 UTC m=+1480.026309122" Mar 01 09:32:31 crc kubenswrapper[4792]: I0301 09:32:31.435223 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.435201019 podStartE2EDuration="37.435201019s" podCreationTimestamp="2026-03-01 09:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:32:30.806586862 +0000 UTC m=+1480.048466079" watchObservedRunningTime="2026-03-01 09:32:31.435201019 +0000 UTC m=+1480.677080206" Mar 01 09:32:41 crc kubenswrapper[4792]: I0301 09:32:41.850774 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" event={"ID":"54e68c85-54c7-4855-b4a0-a85d2014c7b7","Type":"ContainerStarted","Data":"5773fb565f7454c83bd5a97647f2258db8af4721a7199d6a6eb816399f3a0abe"} Mar 01 09:32:44 crc kubenswrapper[4792]: I0301 09:32:44.872086 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 01 09:32:44 crc kubenswrapper[4792]: I0301 09:32:44.895132 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:32:44 crc kubenswrapper[4792]: I0301 09:32:44.905673 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" podStartSLOduration=5.374944135 podStartE2EDuration="16.905652483s" podCreationTimestamp="2026-03-01 09:32:28 +0000 UTC" firstStartedPulling="2026-03-01 09:32:29.182630847 +0000 UTC m=+1478.424510044" lastFinishedPulling="2026-03-01 09:32:40.713339195 +0000 UTC m=+1489.955218392" observedRunningTime="2026-03-01 09:32:41.880041104 +0000 UTC m=+1491.121920371" watchObservedRunningTime="2026-03-01 09:32:44.905652483 +0000 UTC m=+1494.147531700" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.561797 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-8np4j"] Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.564168 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.629099 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8np4j"] Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.702930 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9469s\" (UniqueName: \"kubernetes.io/projected/6ade9641-e262-4086-b871-3d010d48a86a-kube-api-access-9469s\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.702993 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-utilities\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.703042 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-catalog-content\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.804777 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9469s\" (UniqueName: \"kubernetes.io/projected/6ade9641-e262-4086-b871-3d010d48a86a-kube-api-access-9469s\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " 
pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.804848 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-utilities\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.804933 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-catalog-content\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.805441 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-utilities\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.805446 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-catalog-content\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.824947 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9469s\" (UniqueName: \"kubernetes.io/projected/6ade9641-e262-4086-b871-3d010d48a86a-kube-api-access-9469s\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc 
kubenswrapper[4792]: I0301 09:32:45.884838 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:46 crc kubenswrapper[4792]: I0301 09:32:46.444921 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8np4j"] Mar 01 09:32:46 crc kubenswrapper[4792]: I0301 09:32:46.888500 4792 generic.go:334] "Generic (PLEG): container finished" podID="6ade9641-e262-4086-b871-3d010d48a86a" containerID="80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954" exitCode=0 Mar 01 09:32:46 crc kubenswrapper[4792]: I0301 09:32:46.888606 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8np4j" event={"ID":"6ade9641-e262-4086-b871-3d010d48a86a","Type":"ContainerDied","Data":"80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954"} Mar 01 09:32:46 crc kubenswrapper[4792]: I0301 09:32:46.888778 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8np4j" event={"ID":"6ade9641-e262-4086-b871-3d010d48a86a","Type":"ContainerStarted","Data":"0191adc5316cf18ee8f837f299248bbe54008a3de338797be3e29fa3dacc63bb"} Mar 01 09:32:47 crc kubenswrapper[4792]: I0301 09:32:47.908021 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8np4j" event={"ID":"6ade9641-e262-4086-b871-3d010d48a86a","Type":"ContainerStarted","Data":"f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e"} Mar 01 09:32:52 crc kubenswrapper[4792]: I0301 09:32:52.955044 4792 generic.go:334] "Generic (PLEG): container finished" podID="54e68c85-54c7-4855-b4a0-a85d2014c7b7" containerID="5773fb565f7454c83bd5a97647f2258db8af4721a7199d6a6eb816399f3a0abe" exitCode=0 Mar 01 09:32:52 crc kubenswrapper[4792]: I0301 09:32:52.955109 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" 
event={"ID":"54e68c85-54c7-4855-b4a0-a85d2014c7b7","Type":"ContainerDied","Data":"5773fb565f7454c83bd5a97647f2258db8af4721a7199d6a6eb816399f3a0abe"} Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.543381 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.599777 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-repo-setup-combined-ca-bundle\") pod \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.600135 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-inventory\") pod \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.600421 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-ssh-key-openstack-edpm-ipam\") pod \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.601022 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p25kk\" (UniqueName: \"kubernetes.io/projected/54e68c85-54c7-4855-b4a0-a85d2014c7b7-kube-api-access-p25kk\") pod \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.607454 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/54e68c85-54c7-4855-b4a0-a85d2014c7b7-kube-api-access-p25kk" (OuterVolumeSpecName: "kube-api-access-p25kk") pod "54e68c85-54c7-4855-b4a0-a85d2014c7b7" (UID: "54e68c85-54c7-4855-b4a0-a85d2014c7b7"). InnerVolumeSpecName "kube-api-access-p25kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.623146 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "54e68c85-54c7-4855-b4a0-a85d2014c7b7" (UID: "54e68c85-54c7-4855-b4a0-a85d2014c7b7"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.632615 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "54e68c85-54c7-4855-b4a0-a85d2014c7b7" (UID: "54e68c85-54c7-4855-b4a0-a85d2014c7b7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.650817 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-inventory" (OuterVolumeSpecName: "inventory") pod "54e68c85-54c7-4855-b4a0-a85d2014c7b7" (UID: "54e68c85-54c7-4855-b4a0-a85d2014c7b7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.704586 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.704662 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.704689 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p25kk\" (UniqueName: \"kubernetes.io/projected/54e68c85-54c7-4855-b4a0-a85d2014c7b7-kube-api-access-p25kk\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.704709 4792 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.974520 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" event={"ID":"54e68c85-54c7-4855-b4a0-a85d2014c7b7","Type":"ContainerDied","Data":"d567265e3ca5a368f3df31052aed27c28d8ae817e07fd00d4fefda5d841711b8"} Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.974566 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d567265e3ca5a368f3df31052aed27c28d8ae817e07fd00d4fefda5d841711b8" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.974536 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.977154 4792 generic.go:334] "Generic (PLEG): container finished" podID="6ade9641-e262-4086-b871-3d010d48a86a" containerID="f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e" exitCode=0 Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.977190 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8np4j" event={"ID":"6ade9641-e262-4086-b871-3d010d48a86a","Type":"ContainerDied","Data":"f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e"} Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.084825 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs"] Mar 01 09:32:55 crc kubenswrapper[4792]: E0301 09:32:55.085211 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e68c85-54c7-4855-b4a0-a85d2014c7b7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.085228 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e68c85-54c7-4855-b4a0-a85d2014c7b7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.085389 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e68c85-54c7-4855-b4a0-a85d2014c7b7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.085955 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.088687 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.089158 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.089451 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.089630 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.101332 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs"] Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.109274 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.109473 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.109559 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlgpb\" (UniqueName: \"kubernetes.io/projected/4a742181-aebe-42f8-a83e-fee7b480366b-kube-api-access-xlgpb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.109635 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.211285 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgpb\" (UniqueName: \"kubernetes.io/projected/4a742181-aebe-42f8-a83e-fee7b480366b-kube-api-access-xlgpb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.211363 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.211601 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.211654 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.215686 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.215870 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.216253 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.235285 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlgpb\" (UniqueName: \"kubernetes.io/projected/4a742181-aebe-42f8-a83e-fee7b480366b-kube-api-access-xlgpb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.407139 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.988658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8np4j" event={"ID":"6ade9641-e262-4086-b871-3d010d48a86a","Type":"ContainerStarted","Data":"14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0"} Mar 01 09:32:56 crc kubenswrapper[4792]: I0301 09:32:56.014551 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8np4j" podStartSLOduration=2.533332442 podStartE2EDuration="11.014533512s" podCreationTimestamp="2026-03-01 09:32:45 +0000 UTC" firstStartedPulling="2026-03-01 09:32:46.890679467 +0000 UTC m=+1496.132558664" lastFinishedPulling="2026-03-01 09:32:55.371880537 +0000 UTC m=+1504.613759734" observedRunningTime="2026-03-01 09:32:56.013470907 +0000 UTC m=+1505.255350104" watchObservedRunningTime="2026-03-01 09:32:56.014533512 +0000 UTC m=+1505.256412709" Mar 01 09:32:56 crc kubenswrapper[4792]: I0301 09:32:56.035212 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs"] Mar 01 09:32:56 crc kubenswrapper[4792]: W0301 09:32:56.041090 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a742181_aebe_42f8_a83e_fee7b480366b.slice/crio-b08f4fb0c894f674888aac3a423e143dc1d90f11637cf92b6939cf3c1832d890 WatchSource:0}: Error finding container b08f4fb0c894f674888aac3a423e143dc1d90f11637cf92b6939cf3c1832d890: Status 404 returned error can't find the container with id b08f4fb0c894f674888aac3a423e143dc1d90f11637cf92b6939cf3c1832d890 Mar 01 09:32:56 crc kubenswrapper[4792]: I0301 09:32:56.996543 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" event={"ID":"4a742181-aebe-42f8-a83e-fee7b480366b","Type":"ContainerStarted","Data":"1fcee8427ea6340db8e69cb0e43a52de1fe2f18dc84e960d49fc0b0918052c29"} Mar 01 09:32:56 crc kubenswrapper[4792]: I0301 09:32:56.997067 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" event={"ID":"4a742181-aebe-42f8-a83e-fee7b480366b","Type":"ContainerStarted","Data":"b08f4fb0c894f674888aac3a423e143dc1d90f11637cf92b6939cf3c1832d890"} Mar 01 09:32:57 crc kubenswrapper[4792]: I0301 09:32:57.014847 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" podStartSLOduration=1.6561432200000001 podStartE2EDuration="2.014829335s" podCreationTimestamp="2026-03-01 09:32:55 +0000 UTC" firstStartedPulling="2026-03-01 09:32:56.043673522 +0000 UTC m=+1505.285552719" lastFinishedPulling="2026-03-01 09:32:56.402359637 +0000 UTC m=+1505.644238834" observedRunningTime="2026-03-01 09:32:57.012536021 +0000 UTC m=+1506.254415218" watchObservedRunningTime="2026-03-01 09:32:57.014829335 +0000 UTC m=+1506.256708532" Mar 01 09:33:04 crc kubenswrapper[4792]: I0301 09:33:04.943248 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:33:04 crc kubenswrapper[4792]: I0301 09:33:04.943787 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:33:05 crc kubenswrapper[4792]: I0301 09:33:05.885617 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:33:05 crc kubenswrapper[4792]: I0301 09:33:05.885665 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:33:06 crc kubenswrapper[4792]: I0301 09:33:06.929038 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8np4j" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="registry-server" probeResult="failure" output=< Mar 01 09:33:06 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:33:06 crc kubenswrapper[4792]: > Mar 01 09:33:16 crc kubenswrapper[4792]: I0301 09:33:16.929073 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8np4j" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="registry-server" probeResult="failure" output=< Mar 01 09:33:16 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:33:16 crc kubenswrapper[4792]: > Mar 01 09:33:25 crc kubenswrapper[4792]: I0301 09:33:25.931447 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:33:25 crc kubenswrapper[4792]: I0301 09:33:25.979897 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:33:26 crc kubenswrapper[4792]: I0301 09:33:26.187238 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8np4j"] Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.280117 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8np4j" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="registry-server" containerID="cri-o://14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0" gracePeriod=2 Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.637213 4792 scope.go:117] "RemoveContainer" containerID="e872a8250debe35b2b405169cd43bdbc962c34739bd277e35d8038f3fa166251" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.647936 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.658698 4792 scope.go:117] "RemoveContainer" containerID="6dd143b9e9badd592279cca432fe539c49e92a79ca469f608516d1e967d18c73" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.698181 4792 scope.go:117] "RemoveContainer" containerID="68eabc969b4329c81ee454f5c339af1b09a491b6cf0b1ab092fc279d1ef9e440" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.735451 4792 scope.go:117] "RemoveContainer" containerID="81c1c2615bd05b6e2f8a23b6d892f8335b3c7a5c117575ce3ed245f2faa7543f" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.791651 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-catalog-content\") pod \"6ade9641-e262-4086-b871-3d010d48a86a\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.791744 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-utilities\") pod \"6ade9641-e262-4086-b871-3d010d48a86a\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.791988 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9469s\" (UniqueName: \"kubernetes.io/projected/6ade9641-e262-4086-b871-3d010d48a86a-kube-api-access-9469s\") pod \"6ade9641-e262-4086-b871-3d010d48a86a\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.793009 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-utilities" (OuterVolumeSpecName: "utilities") pod "6ade9641-e262-4086-b871-3d010d48a86a" (UID: "6ade9641-e262-4086-b871-3d010d48a86a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.796748 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ade9641-e262-4086-b871-3d010d48a86a-kube-api-access-9469s" (OuterVolumeSpecName: "kube-api-access-9469s") pod "6ade9641-e262-4086-b871-3d010d48a86a" (UID: "6ade9641-e262-4086-b871-3d010d48a86a"). InnerVolumeSpecName "kube-api-access-9469s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.894657 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9469s\" (UniqueName: \"kubernetes.io/projected/6ade9641-e262-4086-b871-3d010d48a86a-kube-api-access-9469s\") on node \"crc\" DevicePath \"\"" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.894701 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.918726 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ade9641-e262-4086-b871-3d010d48a86a" (UID: "6ade9641-e262-4086-b871-3d010d48a86a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.996985 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.291967 4792 generic.go:334] "Generic (PLEG): container finished" podID="6ade9641-e262-4086-b871-3d010d48a86a" containerID="14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0" exitCode=0 Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.292029 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.292017 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8np4j" event={"ID":"6ade9641-e262-4086-b871-3d010d48a86a","Type":"ContainerDied","Data":"14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0"} Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.292377 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8np4j" event={"ID":"6ade9641-e262-4086-b871-3d010d48a86a","Type":"ContainerDied","Data":"0191adc5316cf18ee8f837f299248bbe54008a3de338797be3e29fa3dacc63bb"} Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.292408 4792 scope.go:117] "RemoveContainer" containerID="14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.320134 4792 scope.go:117] "RemoveContainer" containerID="f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.332309 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8np4j"] Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.341801 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8np4j"] Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.344227 4792 scope.go:117] "RemoveContainer" containerID="80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.364219 4792 scope.go:117] "RemoveContainer" containerID="14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0" Mar 01 09:33:28 crc kubenswrapper[4792]: E0301 09:33:28.364704 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0\": container with ID starting with 14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0 not found: ID does not exist" containerID="14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.364733 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0"} err="failed to get container status \"14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0\": rpc error: code = NotFound desc = could not find container \"14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0\": container with ID starting with 14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0 not found: ID does not exist" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.364754 4792 scope.go:117] "RemoveContainer" containerID="f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e" Mar 01 09:33:28 crc kubenswrapper[4792]: E0301 09:33:28.365034 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e\": container with ID starting with f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e not found: ID does not exist" containerID="f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.365060 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e"} err="failed to get container status \"f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e\": rpc error: code = NotFound desc = could not find container \"f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e\": container with ID 
starting with f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e not found: ID does not exist" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.365195 4792 scope.go:117] "RemoveContainer" containerID="80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954" Mar 01 09:33:28 crc kubenswrapper[4792]: E0301 09:33:28.365404 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954\": container with ID starting with 80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954 not found: ID does not exist" containerID="80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.365422 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954"} err="failed to get container status \"80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954\": rpc error: code = NotFound desc = could not find container \"80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954\": container with ID starting with 80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954 not found: ID does not exist" Mar 01 09:33:29 crc kubenswrapper[4792]: I0301 09:33:29.418424 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ade9641-e262-4086-b871-3d010d48a86a" path="/var/lib/kubelet/pods/6ade9641-e262-4086-b871-3d010d48a86a/volumes" Mar 01 09:33:34 crc kubenswrapper[4792]: I0301 09:33:34.943509 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:33:34 crc kubenswrapper[4792]: I0301 
09:33:34.944062 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.146588 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539294-5jtlh"] Mar 01 09:34:00 crc kubenswrapper[4792]: E0301 09:34:00.149332 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="registry-server" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.149460 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="registry-server" Mar 01 09:34:00 crc kubenswrapper[4792]: E0301 09:34:00.149541 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="extract-utilities" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.149612 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="extract-utilities" Mar 01 09:34:00 crc kubenswrapper[4792]: E0301 09:34:00.149701 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="extract-content" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.149778 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="extract-content" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.150110 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="registry-server" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.150960 4792 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.153984 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.154187 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.154364 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.157508 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539294-5jtlh"] Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.275916 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmfl9\" (UniqueName: \"kubernetes.io/projected/a6725e35-5100-4360-85ca-00aad33007d4-kube-api-access-qmfl9\") pod \"auto-csr-approver-29539294-5jtlh\" (UID: \"a6725e35-5100-4360-85ca-00aad33007d4\") " pod="openshift-infra/auto-csr-approver-29539294-5jtlh" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.377731 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmfl9\" (UniqueName: \"kubernetes.io/projected/a6725e35-5100-4360-85ca-00aad33007d4-kube-api-access-qmfl9\") pod \"auto-csr-approver-29539294-5jtlh\" (UID: \"a6725e35-5100-4360-85ca-00aad33007d4\") " pod="openshift-infra/auto-csr-approver-29539294-5jtlh" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.408830 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmfl9\" (UniqueName: \"kubernetes.io/projected/a6725e35-5100-4360-85ca-00aad33007d4-kube-api-access-qmfl9\") pod \"auto-csr-approver-29539294-5jtlh\" (UID: 
\"a6725e35-5100-4360-85ca-00aad33007d4\") " pod="openshift-infra/auto-csr-approver-29539294-5jtlh" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.472554 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.922955 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.924702 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539294-5jtlh"] Mar 01 09:34:01 crc kubenswrapper[4792]: I0301 09:34:01.561414 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" event={"ID":"a6725e35-5100-4360-85ca-00aad33007d4","Type":"ContainerStarted","Data":"63dbef626b111579f2bd4db9775a778114d3a877b3fcc4798e7ba32d50b983f5"} Mar 01 09:34:02 crc kubenswrapper[4792]: I0301 09:34:02.572814 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" event={"ID":"a6725e35-5100-4360-85ca-00aad33007d4","Type":"ContainerStarted","Data":"e107bd42472fceec66462b44aaa6f7f47fb07e9eba8ac8e30bec4fee69d4eff3"} Mar 01 09:34:02 crc kubenswrapper[4792]: I0301 09:34:02.594859 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" podStartSLOduration=1.733798539 podStartE2EDuration="2.594823121s" podCreationTimestamp="2026-03-01 09:34:00 +0000 UTC" firstStartedPulling="2026-03-01 09:34:00.922743563 +0000 UTC m=+1570.164622760" lastFinishedPulling="2026-03-01 09:34:01.783768135 +0000 UTC m=+1571.025647342" observedRunningTime="2026-03-01 09:34:02.584456349 +0000 UTC m=+1571.826335536" watchObservedRunningTime="2026-03-01 09:34:02.594823121 +0000 UTC m=+1571.836702318" Mar 01 09:34:03 crc kubenswrapper[4792]: I0301 09:34:03.583627 4792 
generic.go:334] "Generic (PLEG): container finished" podID="a6725e35-5100-4360-85ca-00aad33007d4" containerID="e107bd42472fceec66462b44aaa6f7f47fb07e9eba8ac8e30bec4fee69d4eff3" exitCode=0 Mar 01 09:34:03 crc kubenswrapper[4792]: I0301 09:34:03.583674 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" event={"ID":"a6725e35-5100-4360-85ca-00aad33007d4","Type":"ContainerDied","Data":"e107bd42472fceec66462b44aaa6f7f47fb07e9eba8ac8e30bec4fee69d4eff3"} Mar 01 09:34:04 crc kubenswrapper[4792]: I0301 09:34:04.921586 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" Mar 01 09:34:04 crc kubenswrapper[4792]: I0301 09:34:04.943120 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:34:04 crc kubenswrapper[4792]: I0301 09:34:04.943173 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:34:04 crc kubenswrapper[4792]: I0301 09:34:04.943210 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:34:04 crc kubenswrapper[4792]: I0301 09:34:04.943845 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c6c336556c3895a23d652801049ab8fd2cc3ff89812dc0c31bb6831441e0e06"} 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:34:04 crc kubenswrapper[4792]: I0301 09:34:04.943920 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://6c6c336556c3895a23d652801049ab8fd2cc3ff89812dc0c31bb6831441e0e06" gracePeriod=600 Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.069808 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmfl9\" (UniqueName: \"kubernetes.io/projected/a6725e35-5100-4360-85ca-00aad33007d4-kube-api-access-qmfl9\") pod \"a6725e35-5100-4360-85ca-00aad33007d4\" (UID: \"a6725e35-5100-4360-85ca-00aad33007d4\") " Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.075386 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6725e35-5100-4360-85ca-00aad33007d4-kube-api-access-qmfl9" (OuterVolumeSpecName: "kube-api-access-qmfl9") pod "a6725e35-5100-4360-85ca-00aad33007d4" (UID: "a6725e35-5100-4360-85ca-00aad33007d4"). InnerVolumeSpecName "kube-api-access-qmfl9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.172172 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmfl9\" (UniqueName: \"kubernetes.io/projected/a6725e35-5100-4360-85ca-00aad33007d4-kube-api-access-qmfl9\") on node \"crc\" DevicePath \"\"" Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.618066 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="6c6c336556c3895a23d652801049ab8fd2cc3ff89812dc0c31bb6831441e0e06" exitCode=0 Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.618758 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"6c6c336556c3895a23d652801049ab8fd2cc3ff89812dc0c31bb6831441e0e06"} Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.618875 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e"} Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.618983 4792 scope.go:117] "RemoveContainer" containerID="d3c51b46f8c635f0bd922e6a816c6cbadeb855fe42f4d60474cde44514e4e4de" Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.636556 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" event={"ID":"a6725e35-5100-4360-85ca-00aad33007d4","Type":"ContainerDied","Data":"63dbef626b111579f2bd4db9775a778114d3a877b3fcc4798e7ba32d50b983f5"} Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.636593 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63dbef626b111579f2bd4db9775a778114d3a877b3fcc4798e7ba32d50b983f5" Mar 01 
09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.636656 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.675928 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539288-9klf4"] Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.686562 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539288-9klf4"] Mar 01 09:34:07 crc kubenswrapper[4792]: I0301 09:34:07.419141 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e8f417d-a9b7-4969-9e24-785fa8baf9c4" path="/var/lib/kubelet/pods/2e8f417d-a9b7-4969-9e24-785fa8baf9c4/volumes" Mar 01 09:34:27 crc kubenswrapper[4792]: I0301 09:34:27.824783 4792 scope.go:117] "RemoveContainer" containerID="ac3d91f96c6efaf7baa089ecdf84d5d3fe923f61545b960c7a1aa1d77e8db2e5" Mar 01 09:34:27 crc kubenswrapper[4792]: I0301 09:34:27.844689 4792 scope.go:117] "RemoveContainer" containerID="670f35fecbdbd6b75d71f5beac8b8d230ac85378c92ee4b0813d0d49f8a4dde5" Mar 01 09:34:27 crc kubenswrapper[4792]: I0301 09:34:27.902350 4792 scope.go:117] "RemoveContainer" containerID="0b4398286a53ae92983ef93db19480d6804e4b83a997761fc68f16627e65ecd5" Mar 01 09:34:27 crc kubenswrapper[4792]: I0301 09:34:27.952230 4792 scope.go:117] "RemoveContainer" containerID="4fd86a535781157d736a326c5d3973270ef9e75f90ca5c7a184728477646f601" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.455256 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5zx"] Mar 01 09:34:59 crc kubenswrapper[4792]: E0301 09:34:59.456657 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6725e35-5100-4360-85ca-00aad33007d4" containerName="oc" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.456676 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a6725e35-5100-4360-85ca-00aad33007d4" containerName="oc" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.456898 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6725e35-5100-4360-85ca-00aad33007d4" containerName="oc" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.458483 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.466960 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5zx"] Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.556981 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-catalog-content\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.557091 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-utilities\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.557147 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29rc5\" (UniqueName: \"kubernetes.io/projected/a89252a8-b40d-4834-b779-de581f79f189-kube-api-access-29rc5\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.659174 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-29rc5\" (UniqueName: \"kubernetes.io/projected/a89252a8-b40d-4834-b779-de581f79f189-kube-api-access-29rc5\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.659298 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-catalog-content\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.659452 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-utilities\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.659835 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-catalog-content\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.659935 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-utilities\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.677575 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29rc5\" (UniqueName: 
\"kubernetes.io/projected/a89252a8-b40d-4834-b779-de581f79f189-kube-api-access-29rc5\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.779534 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:35:00 crc kubenswrapper[4792]: I0301 09:35:00.239050 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5zx"] Mar 01 09:35:01 crc kubenswrapper[4792]: I0301 09:35:01.134072 4792 generic.go:334] "Generic (PLEG): container finished" podID="a89252a8-b40d-4834-b779-de581f79f189" containerID="f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0" exitCode=0 Mar 01 09:35:01 crc kubenswrapper[4792]: I0301 09:35:01.134175 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5zx" event={"ID":"a89252a8-b40d-4834-b779-de581f79f189","Type":"ContainerDied","Data":"f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0"} Mar 01 09:35:01 crc kubenswrapper[4792]: I0301 09:35:01.134398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5zx" event={"ID":"a89252a8-b40d-4834-b779-de581f79f189","Type":"ContainerStarted","Data":"38095d24331ba4477eb556caad35e6df6322eb44f1fa245ce42a42dc8a30b1ca"} Mar 01 09:35:02 crc kubenswrapper[4792]: I0301 09:35:02.144580 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5zx" event={"ID":"a89252a8-b40d-4834-b779-de581f79f189","Type":"ContainerStarted","Data":"f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c"} Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.154112 4792 generic.go:334] "Generic (PLEG): container finished" podID="a89252a8-b40d-4834-b779-de581f79f189" 
containerID="f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c" exitCode=0 Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.154362 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5zx" event={"ID":"a89252a8-b40d-4834-b779-de581f79f189","Type":"ContainerDied","Data":"f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c"} Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.237412 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fcxll"] Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.239418 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.253660 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcxll"] Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.330316 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-catalog-content\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.330377 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6djqm\" (UniqueName: \"kubernetes.io/projected/1afe4776-6480-4f35-afcc-a281193262c9-kube-api-access-6djqm\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.330402 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-utilities\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.432034 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6djqm\" (UniqueName: \"kubernetes.io/projected/1afe4776-6480-4f35-afcc-a281193262c9-kube-api-access-6djqm\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.432090 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-utilities\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.432223 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-catalog-content\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.432579 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-utilities\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.432630 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-catalog-content\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.452929 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6djqm\" (UniqueName: \"kubernetes.io/projected/1afe4776-6480-4f35-afcc-a281193262c9-kube-api-access-6djqm\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.556969 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:04 crc kubenswrapper[4792]: I0301 09:35:04.061964 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcxll"] Mar 01 09:35:04 crc kubenswrapper[4792]: I0301 09:35:04.163265 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcxll" event={"ID":"1afe4776-6480-4f35-afcc-a281193262c9","Type":"ContainerStarted","Data":"3a50e5fc5ec68b469488a198251a926259d52265bbc08a456414a2ff134347dd"} Mar 01 09:35:04 crc kubenswrapper[4792]: I0301 09:35:04.165957 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5zx" event={"ID":"a89252a8-b40d-4834-b779-de581f79f189","Type":"ContainerStarted","Data":"ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71"} Mar 01 09:35:04 crc kubenswrapper[4792]: I0301 09:35:04.190300 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wl5zx" podStartSLOduration=2.794768243 podStartE2EDuration="5.190275422s" podCreationTimestamp="2026-03-01 09:34:59 +0000 UTC" firstStartedPulling="2026-03-01 09:35:01.135773445 +0000 
UTC m=+1630.377652652" lastFinishedPulling="2026-03-01 09:35:03.531280634 +0000 UTC m=+1632.773159831" observedRunningTime="2026-03-01 09:35:04.18487096 +0000 UTC m=+1633.426750167" watchObservedRunningTime="2026-03-01 09:35:04.190275422 +0000 UTC m=+1633.432154619"
Mar 01 09:35:05 crc kubenswrapper[4792]: I0301 09:35:05.175185 4792 generic.go:334] "Generic (PLEG): container finished" podID="1afe4776-6480-4f35-afcc-a281193262c9" containerID="388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926" exitCode=0
Mar 01 09:35:05 crc kubenswrapper[4792]: I0301 09:35:05.175407 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcxll" event={"ID":"1afe4776-6480-4f35-afcc-a281193262c9","Type":"ContainerDied","Data":"388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926"}
Mar 01 09:35:06 crc kubenswrapper[4792]: I0301 09:35:06.184414 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcxll" event={"ID":"1afe4776-6480-4f35-afcc-a281193262c9","Type":"ContainerStarted","Data":"28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2"}
Mar 01 09:35:09 crc kubenswrapper[4792]: I0301 09:35:09.208370 4792 generic.go:334] "Generic (PLEG): container finished" podID="1afe4776-6480-4f35-afcc-a281193262c9" containerID="28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2" exitCode=0
Mar 01 09:35:09 crc kubenswrapper[4792]: I0301 09:35:09.208446 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcxll" event={"ID":"1afe4776-6480-4f35-afcc-a281193262c9","Type":"ContainerDied","Data":"28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2"}
Mar 01 09:35:09 crc kubenswrapper[4792]: I0301 09:35:09.780396 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wl5zx"
Mar 01 09:35:09 crc kubenswrapper[4792]: I0301 09:35:09.780733 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wl5zx"
Mar 01 09:35:09 crc kubenswrapper[4792]: I0301 09:35:09.827728 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wl5zx"
Mar 01 09:35:10 crc kubenswrapper[4792]: I0301 09:35:10.220261 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcxll" event={"ID":"1afe4776-6480-4f35-afcc-a281193262c9","Type":"ContainerStarted","Data":"5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666"}
Mar 01 09:35:10 crc kubenswrapper[4792]: I0301 09:35:10.253345 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fcxll" podStartSLOduration=2.840584367 podStartE2EDuration="7.253324624s" podCreationTimestamp="2026-03-01 09:35:03 +0000 UTC" firstStartedPulling="2026-03-01 09:35:05.177747295 +0000 UTC m=+1634.419626492" lastFinishedPulling="2026-03-01 09:35:09.590487552 +0000 UTC m=+1638.832366749" observedRunningTime="2026-03-01 09:35:10.245309408 +0000 UTC m=+1639.487188605" watchObservedRunningTime="2026-03-01 09:35:10.253324624 +0000 UTC m=+1639.495203821"
Mar 01 09:35:10 crc kubenswrapper[4792]: I0301 09:35:10.273670 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wl5zx"
Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.029621 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5zx"]
Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.235666 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wl5zx" podUID="a89252a8-b40d-4834-b779-de581f79f189" containerName="registry-server" containerID="cri-o://ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71" gracePeriod=2
Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.721250 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl5zx"
Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.808268 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-catalog-content\") pod \"a89252a8-b40d-4834-b779-de581f79f189\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") "
Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.808310 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29rc5\" (UniqueName: \"kubernetes.io/projected/a89252a8-b40d-4834-b779-de581f79f189-kube-api-access-29rc5\") pod \"a89252a8-b40d-4834-b779-de581f79f189\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") "
Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.808341 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-utilities\") pod \"a89252a8-b40d-4834-b779-de581f79f189\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") "
Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.809545 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-utilities" (OuterVolumeSpecName: "utilities") pod "a89252a8-b40d-4834-b779-de581f79f189" (UID: "a89252a8-b40d-4834-b779-de581f79f189"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.810160 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-utilities\") on node \"crc\" DevicePath \"\""
Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.814259 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a89252a8-b40d-4834-b779-de581f79f189-kube-api-access-29rc5" (OuterVolumeSpecName: "kube-api-access-29rc5") pod "a89252a8-b40d-4834-b779-de581f79f189" (UID: "a89252a8-b40d-4834-b779-de581f79f189"). InnerVolumeSpecName "kube-api-access-29rc5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.836093 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a89252a8-b40d-4834-b779-de581f79f189" (UID: "a89252a8-b40d-4834-b779-de581f79f189"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.912332 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.912759 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29rc5\" (UniqueName: \"kubernetes.io/projected/a89252a8-b40d-4834-b779-de581f79f189-kube-api-access-29rc5\") on node \"crc\" DevicePath \"\""
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.245200 4792 generic.go:334] "Generic (PLEG): container finished" podID="a89252a8-b40d-4834-b779-de581f79f189" containerID="ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71" exitCode=0
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.245261 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5zx" event={"ID":"a89252a8-b40d-4834-b779-de581f79f189","Type":"ContainerDied","Data":"ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71"}
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.245295 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl5zx"
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.245312 4792 scope.go:117] "RemoveContainer" containerID="ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71"
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.245298 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5zx" event={"ID":"a89252a8-b40d-4834-b779-de581f79f189","Type":"ContainerDied","Data":"38095d24331ba4477eb556caad35e6df6322eb44f1fa245ce42a42dc8a30b1ca"}
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.273670 4792 scope.go:117] "RemoveContainer" containerID="f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c"
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.279987 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5zx"]
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.297030 4792 scope.go:117] "RemoveContainer" containerID="f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0"
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.308775 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5zx"]
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.337105 4792 scope.go:117] "RemoveContainer" containerID="ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71"
Mar 01 09:35:13 crc kubenswrapper[4792]: E0301 09:35:13.337983 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71\": container with ID starting with ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71 not found: ID does not exist" containerID="ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71"
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.338028 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71"} err="failed to get container status \"ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71\": rpc error: code = NotFound desc = could not find container \"ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71\": container with ID starting with ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71 not found: ID does not exist"
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.338051 4792 scope.go:117] "RemoveContainer" containerID="f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c"
Mar 01 09:35:13 crc kubenswrapper[4792]: E0301 09:35:13.338362 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c\": container with ID starting with f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c not found: ID does not exist" containerID="f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c"
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.338391 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c"} err="failed to get container status \"f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c\": rpc error: code = NotFound desc = could not find container \"f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c\": container with ID starting with f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c not found: ID does not exist"
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.338407 4792 scope.go:117] "RemoveContainer" containerID="f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0"
Mar 01 09:35:13 crc kubenswrapper[4792]: E0301 09:35:13.338729 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0\": container with ID starting with f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0 not found: ID does not exist" containerID="f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0"
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.338784 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0"} err="failed to get container status \"f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0\": rpc error: code = NotFound desc = could not find container \"f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0\": container with ID starting with f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0 not found: ID does not exist"
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.420857 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a89252a8-b40d-4834-b779-de581f79f189" path="/var/lib/kubelet/pods/a89252a8-b40d-4834-b779-de581f79f189/volumes"
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.557421 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fcxll"
Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.557504 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fcxll"
Mar 01 09:35:14 crc kubenswrapper[4792]: I0301 09:35:14.603038 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fcxll" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="registry-server" probeResult="failure" output=<
Mar 01 09:35:14 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 01 09:35:14 crc kubenswrapper[4792]: >
Mar 01 09:35:23 crc kubenswrapper[4792]: I0301 09:35:23.609311 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fcxll"
Mar 01 09:35:23 crc kubenswrapper[4792]: I0301 09:35:23.665022 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fcxll"
Mar 01 09:35:23 crc kubenswrapper[4792]: I0301 09:35:23.851509 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fcxll"]
Mar 01 09:35:25 crc kubenswrapper[4792]: I0301 09:35:25.340167 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fcxll" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="registry-server" containerID="cri-o://5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666" gracePeriod=2
Mar 01 09:35:25 crc kubenswrapper[4792]: I0301 09:35:25.810687 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcxll"
Mar 01 09:35:25 crc kubenswrapper[4792]: I0301 09:35:25.951785 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-utilities\") pod \"1afe4776-6480-4f35-afcc-a281193262c9\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") "
Mar 01 09:35:25 crc kubenswrapper[4792]: I0301 09:35:25.951838 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6djqm\" (UniqueName: \"kubernetes.io/projected/1afe4776-6480-4f35-afcc-a281193262c9-kube-api-access-6djqm\") pod \"1afe4776-6480-4f35-afcc-a281193262c9\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") "
Mar 01 09:35:25 crc kubenswrapper[4792]: I0301 09:35:25.952122 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-catalog-content\") pod \"1afe4776-6480-4f35-afcc-a281193262c9\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") "
Mar 01 09:35:25 crc kubenswrapper[4792]: I0301 09:35:25.953898 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-utilities" (OuterVolumeSpecName: "utilities") pod "1afe4776-6480-4f35-afcc-a281193262c9" (UID: "1afe4776-6480-4f35-afcc-a281193262c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:35:25 crc kubenswrapper[4792]: I0301 09:35:25.960793 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1afe4776-6480-4f35-afcc-a281193262c9-kube-api-access-6djqm" (OuterVolumeSpecName: "kube-api-access-6djqm") pod "1afe4776-6480-4f35-afcc-a281193262c9" (UID: "1afe4776-6480-4f35-afcc-a281193262c9"). InnerVolumeSpecName "kube-api-access-6djqm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:36:26.017649 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1afe4776-6480-4f35-afcc-a281193262c9" (UID: "1afe4776-6480-4f35-afcc-a281193262c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.054775 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.054811 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-utilities\") on node \"crc\" DevicePath \"\""
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.054827 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6djqm\" (UniqueName: \"kubernetes.io/projected/1afe4776-6480-4f35-afcc-a281193262c9-kube-api-access-6djqm\") on node \"crc\" DevicePath \"\""
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.353847 4792 generic.go:334] "Generic (PLEG): container finished" podID="1afe4776-6480-4f35-afcc-a281193262c9" containerID="5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666" exitCode=0
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.353981 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcxll" event={"ID":"1afe4776-6480-4f35-afcc-a281193262c9","Type":"ContainerDied","Data":"5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666"}
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.354021 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcxll" event={"ID":"1afe4776-6480-4f35-afcc-a281193262c9","Type":"ContainerDied","Data":"3a50e5fc5ec68b469488a198251a926259d52265bbc08a456414a2ff134347dd"}
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.354050 4792 scope.go:117] "RemoveContainer" containerID="5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666"
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.354204 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcxll"
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.376023 4792 scope.go:117] "RemoveContainer" containerID="28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2"
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.404570 4792 scope.go:117] "RemoveContainer" containerID="388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926"
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.421818 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fcxll"]
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.431414 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fcxll"]
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.458289 4792 scope.go:117] "RemoveContainer" containerID="5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666"
Mar 01 09:35:26 crc kubenswrapper[4792]: E0301 09:35:26.460403 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666\": container with ID starting with 5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666 not found: ID does not exist" containerID="5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666"
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.460455 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666"} err="failed to get container status \"5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666\": rpc error: code = NotFound desc = could not find container \"5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666\": container with ID starting with 5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666 not found: ID does not exist"
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.460487 4792 scope.go:117] "RemoveContainer" containerID="28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2"
Mar 01 09:35:26 crc kubenswrapper[4792]: E0301 09:35:26.460829 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2\": container with ID starting with 28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2 not found: ID does not exist" containerID="28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2"
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.460929 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2"} err="failed to get container status \"28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2\": rpc error: code = NotFound desc = could not find container \"28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2\": container with ID starting with 28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2 not found: ID does not exist"
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.461042 4792 scope.go:117] "RemoveContainer" containerID="388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926"
Mar 01 09:35:26 crc kubenswrapper[4792]: E0301 09:35:26.461511 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926\": container with ID starting with 388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926 not found: ID does not exist" containerID="388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926"
Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.461544 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926"} err="failed to get container status \"388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926\": rpc error: code = NotFound desc = could not find container \"388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926\": container with ID starting with 388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926 not found: ID does not exist"
Mar 01 09:35:27 crc kubenswrapper[4792]: I0301 09:35:27.418470 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1afe4776-6480-4f35-afcc-a281193262c9" path="/var/lib/kubelet/pods/1afe4776-6480-4f35-afcc-a281193262c9/volumes"
Mar 01 09:35:28 crc kubenswrapper[4792]: I0301 09:35:28.049995 4792 scope.go:117] "RemoveContainer" containerID="10b8b301db81f96185e1ca933d6d257371da7cfb2a533a1434c80a6ff2a5895f"
Mar 01 09:35:28 crc kubenswrapper[4792]: I0301 09:35:28.073955 4792 scope.go:117] "RemoveContainer" containerID="131d1c48281fda07ee509861ecd19ed50a8dc2c67c40d98a6892403dc5e2415a"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.143930 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539296-87tnt"]
Mar 01 09:36:00 crc kubenswrapper[4792]: E0301 09:36:00.144753 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a89252a8-b40d-4834-b779-de581f79f189" containerName="extract-utilities"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.144766 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89252a8-b40d-4834-b779-de581f79f189" containerName="extract-utilities"
Mar 01 09:36:00 crc kubenswrapper[4792]: E0301 09:36:00.144796 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="registry-server"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.144802 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="registry-server"
Mar 01 09:36:00 crc kubenswrapper[4792]: E0301 09:36:00.144809 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="extract-utilities"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.144816 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="extract-utilities"
Mar 01 09:36:00 crc kubenswrapper[4792]: E0301 09:36:00.144826 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a89252a8-b40d-4834-b779-de581f79f189" containerName="registry-server"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.144831 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89252a8-b40d-4834-b779-de581f79f189" containerName="registry-server"
Mar 01 09:36:00 crc kubenswrapper[4792]: E0301 09:36:00.144845 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a89252a8-b40d-4834-b779-de581f79f189" containerName="extract-content"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.144850 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89252a8-b40d-4834-b779-de581f79f189" containerName="extract-content"
Mar 01 09:36:00 crc kubenswrapper[4792]: E0301 09:36:00.144864 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="extract-content"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.144869 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="extract-content"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.145118 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="registry-server"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.145132 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a89252a8-b40d-4834-b779-de581f79f189" containerName="registry-server"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.145765 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539296-87tnt"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.149350 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.151142 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.152467 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.160560 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539296-87tnt"]
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.265444 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv9f9\" (UniqueName: \"kubernetes.io/projected/f0029741-30a3-4fc2-b71d-c77dbd652c35-kube-api-access-wv9f9\") pod \"auto-csr-approver-29539296-87tnt\" (UID: \"f0029741-30a3-4fc2-b71d-c77dbd652c35\") " pod="openshift-infra/auto-csr-approver-29539296-87tnt"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.370892 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv9f9\" (UniqueName: \"kubernetes.io/projected/f0029741-30a3-4fc2-b71d-c77dbd652c35-kube-api-access-wv9f9\") pod \"auto-csr-approver-29539296-87tnt\" (UID: \"f0029741-30a3-4fc2-b71d-c77dbd652c35\") " pod="openshift-infra/auto-csr-approver-29539296-87tnt"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.392674 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv9f9\" (UniqueName: \"kubernetes.io/projected/f0029741-30a3-4fc2-b71d-c77dbd652c35-kube-api-access-wv9f9\") pod \"auto-csr-approver-29539296-87tnt\" (UID: \"f0029741-30a3-4fc2-b71d-c77dbd652c35\") " pod="openshift-infra/auto-csr-approver-29539296-87tnt"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.464239 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539296-87tnt"
Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.926684 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539296-87tnt"]
Mar 01 09:36:01 crc kubenswrapper[4792]: I0301 09:36:01.136386 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539296-87tnt" event={"ID":"f0029741-30a3-4fc2-b71d-c77dbd652c35","Type":"ContainerStarted","Data":"0d3079eb6446bd098d6ec2c9ffe0811e97aa3c5bc35e8c3c9bc098850cbba915"}
Mar 01 09:36:02 crc kubenswrapper[4792]: I0301 09:36:02.145543 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539296-87tnt" event={"ID":"f0029741-30a3-4fc2-b71d-c77dbd652c35","Type":"ContainerStarted","Data":"8b912a3a88a6f648dd530babff742abe47a9567e05895bab6379ed09d8bc8a56"}
Mar 01 09:36:03 crc kubenswrapper[4792]: I0301 09:36:03.156155 4792 generic.go:334] "Generic (PLEG): container finished" podID="f0029741-30a3-4fc2-b71d-c77dbd652c35" containerID="8b912a3a88a6f648dd530babff742abe47a9567e05895bab6379ed09d8bc8a56" exitCode=0
Mar 01 09:36:03 crc kubenswrapper[4792]: I0301 09:36:03.156217 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539296-87tnt" event={"ID":"f0029741-30a3-4fc2-b71d-c77dbd652c35","Type":"ContainerDied","Data":"8b912a3a88a6f648dd530babff742abe47a9567e05895bab6379ed09d8bc8a56"}
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.095103 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cbpls"]
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.101541 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cbpls"
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.107837 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cbpls"]
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.236493 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5vt4\" (UniqueName: \"kubernetes.io/projected/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-kube-api-access-t5vt4\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls"
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.236844 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-catalog-content\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls"
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.236882 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-utilities\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls"
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.339114 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-catalog-content\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls"
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.339198 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-utilities\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls"
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.339324 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5vt4\" (UniqueName: \"kubernetes.io/projected/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-kube-api-access-t5vt4\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls"
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.339664 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-catalog-content\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls"
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.339896 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-utilities\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls"
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.362777 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5vt4\" (UniqueName: \"kubernetes.io/projected/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-kube-api-access-t5vt4\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls"
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.424299 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cbpls"
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.552306 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539296-87tnt"
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.644467 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv9f9\" (UniqueName: \"kubernetes.io/projected/f0029741-30a3-4fc2-b71d-c77dbd652c35-kube-api-access-wv9f9\") pod \"f0029741-30a3-4fc2-b71d-c77dbd652c35\" (UID: \"f0029741-30a3-4fc2-b71d-c77dbd652c35\") "
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.661428 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0029741-30a3-4fc2-b71d-c77dbd652c35-kube-api-access-wv9f9" (OuterVolumeSpecName: "kube-api-access-wv9f9") pod "f0029741-30a3-4fc2-b71d-c77dbd652c35" (UID: "f0029741-30a3-4fc2-b71d-c77dbd652c35"). InnerVolumeSpecName "kube-api-access-wv9f9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.746492 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv9f9\" (UniqueName: \"kubernetes.io/projected/f0029741-30a3-4fc2-b71d-c77dbd652c35-kube-api-access-wv9f9\") on node \"crc\" DevicePath \"\""
Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.895390 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cbpls"]
Mar 01 09:36:04 crc kubenswrapper[4792]: W0301 09:36:04.904130 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce3dfbd4_8d64_4be6_b93d_a3e300c4ed6c.slice/crio-5d7b7f5ad9ee24fe834e1813c98805aeb552492372f5eec1abf81f2121a73b86 WatchSource:0}: Error finding container 5d7b7f5ad9ee24fe834e1813c98805aeb552492372f5eec1abf81f2121a73b86: Status 404 returned error can't find the container with id 5d7b7f5ad9ee24fe834e1813c98805aeb552492372f5eec1abf81f2121a73b86
Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.173670 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539296-87tnt" event={"ID":"f0029741-30a3-4fc2-b71d-c77dbd652c35","Type":"ContainerDied","Data":"0d3079eb6446bd098d6ec2c9ffe0811e97aa3c5bc35e8c3c9bc098850cbba915"}
Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.173955 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d3079eb6446bd098d6ec2c9ffe0811e97aa3c5bc35e8c3c9bc098850cbba915"
Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.173745 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539296-87tnt"
Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.176842 4792 generic.go:334] "Generic (PLEG): container finished" podID="4a742181-aebe-42f8-a83e-fee7b480366b" containerID="1fcee8427ea6340db8e69cb0e43a52de1fe2f18dc84e960d49fc0b0918052c29" exitCode=0
Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.176939 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" event={"ID":"4a742181-aebe-42f8-a83e-fee7b480366b","Type":"ContainerDied","Data":"1fcee8427ea6340db8e69cb0e43a52de1fe2f18dc84e960d49fc0b0918052c29"}
Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.181592 4792 generic.go:334] "Generic (PLEG): container finished" podID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerID="9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e" exitCode=0
Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.181633 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbpls" event={"ID":"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c","Type":"ContainerDied","Data":"9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e"}
Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.181661 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbpls" event={"ID":"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c","Type":"ContainerStarted","Data":"5d7b7f5ad9ee24fe834e1813c98805aeb552492372f5eec1abf81f2121a73b86"}
Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.613254 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539290-npvbb"]
Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.622192 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539290-npvbb"]
Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.197320 4792
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbpls" event={"ID":"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c","Type":"ContainerStarted","Data":"5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83"} Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.683504 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.780641 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-bootstrap-combined-ca-bundle\") pod \"4a742181-aebe-42f8-a83e-fee7b480366b\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.780862 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-inventory\") pod \"4a742181-aebe-42f8-a83e-fee7b480366b\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.780933 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-ssh-key-openstack-edpm-ipam\") pod \"4a742181-aebe-42f8-a83e-fee7b480366b\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.780971 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlgpb\" (UniqueName: \"kubernetes.io/projected/4a742181-aebe-42f8-a83e-fee7b480366b-kube-api-access-xlgpb\") pod \"4a742181-aebe-42f8-a83e-fee7b480366b\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.787306 
4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a742181-aebe-42f8-a83e-fee7b480366b-kube-api-access-xlgpb" (OuterVolumeSpecName: "kube-api-access-xlgpb") pod "4a742181-aebe-42f8-a83e-fee7b480366b" (UID: "4a742181-aebe-42f8-a83e-fee7b480366b"). InnerVolumeSpecName "kube-api-access-xlgpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.790088 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "4a742181-aebe-42f8-a83e-fee7b480366b" (UID: "4a742181-aebe-42f8-a83e-fee7b480366b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.808256 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4a742181-aebe-42f8-a83e-fee7b480366b" (UID: "4a742181-aebe-42f8-a83e-fee7b480366b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.812197 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-inventory" (OuterVolumeSpecName: "inventory") pod "4a742181-aebe-42f8-a83e-fee7b480366b" (UID: "4a742181-aebe-42f8-a83e-fee7b480366b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.884087 4792 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.884261 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.884316 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.884392 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlgpb\" (UniqueName: \"kubernetes.io/projected/4a742181-aebe-42f8-a83e-fee7b480366b-kube-api-access-xlgpb\") on node \"crc\" DevicePath \"\"" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.227992 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.229584 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" event={"ID":"4a742181-aebe-42f8-a83e-fee7b480366b","Type":"ContainerDied","Data":"b08f4fb0c894f674888aac3a423e143dc1d90f11637cf92b6939cf3c1832d890"} Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.229650 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b08f4fb0c894f674888aac3a423e143dc1d90f11637cf92b6939cf3c1832d890" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.300401 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2"] Mar 01 09:36:07 crc kubenswrapper[4792]: E0301 09:36:07.300877 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a742181-aebe-42f8-a83e-fee7b480366b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.300898 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a742181-aebe-42f8-a83e-fee7b480366b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 01 09:36:07 crc kubenswrapper[4792]: E0301 09:36:07.300950 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0029741-30a3-4fc2-b71d-c77dbd652c35" containerName="oc" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.300959 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0029741-30a3-4fc2-b71d-c77dbd652c35" containerName="oc" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.301159 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0029741-30a3-4fc2-b71d-c77dbd652c35" containerName="oc" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.301199 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4a742181-aebe-42f8-a83e-fee7b480366b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.305198 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.308558 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.309100 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.309222 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.309550 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2"] Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.311299 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.392733 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8grr9\" (UniqueName: \"kubernetes.io/projected/1f054d9d-4fbb-4909-826c-e6037c4716bd-kube-api-access-8grr9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.393112 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.393202 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.417644 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3644e57-7093-4402-a6f2-48ed10ac14fa" path="/var/lib/kubelet/pods/d3644e57-7093-4402-a6f2-48ed10ac14fa/volumes" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.494696 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.494751 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.494816 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8grr9\" (UniqueName: \"kubernetes.io/projected/1f054d9d-4fbb-4909-826c-e6037c4716bd-kube-api-access-8grr9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.498105 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.498611 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.511426 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8grr9\" (UniqueName: \"kubernetes.io/projected/1f054d9d-4fbb-4909-826c-e6037c4716bd-kube-api-access-8grr9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.619159 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:08 crc kubenswrapper[4792]: W0301 09:36:08.128305 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f054d9d_4fbb_4909_826c_e6037c4716bd.slice/crio-3d5568558fc77cac38af3f9d8b8a28b534315185559e9662b5fc399941b99b47 WatchSource:0}: Error finding container 3d5568558fc77cac38af3f9d8b8a28b534315185559e9662b5fc399941b99b47: Status 404 returned error can't find the container with id 3d5568558fc77cac38af3f9d8b8a28b534315185559e9662b5fc399941b99b47 Mar 01 09:36:08 crc kubenswrapper[4792]: I0301 09:36:08.129009 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2"] Mar 01 09:36:08 crc kubenswrapper[4792]: I0301 09:36:08.241132 4792 generic.go:334] "Generic (PLEG): container finished" podID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerID="5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83" exitCode=0 Mar 01 09:36:08 crc kubenswrapper[4792]: I0301 09:36:08.241208 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbpls" event={"ID":"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c","Type":"ContainerDied","Data":"5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83"} Mar 01 09:36:08 crc kubenswrapper[4792]: I0301 09:36:08.244342 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" event={"ID":"1f054d9d-4fbb-4909-826c-e6037c4716bd","Type":"ContainerStarted","Data":"3d5568558fc77cac38af3f9d8b8a28b534315185559e9662b5fc399941b99b47"} Mar 01 09:36:09 crc kubenswrapper[4792]: I0301 09:36:09.263628 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" 
event={"ID":"1f054d9d-4fbb-4909-826c-e6037c4716bd","Type":"ContainerStarted","Data":"2b11700043caea12ce3ec9b1a685865b6228db2c927ef984914abec9ff9701b8"} Mar 01 09:36:09 crc kubenswrapper[4792]: I0301 09:36:09.271078 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbpls" event={"ID":"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c","Type":"ContainerStarted","Data":"b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041"} Mar 01 09:36:09 crc kubenswrapper[4792]: I0301 09:36:09.282595 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" podStartSLOduration=1.6839720759999999 podStartE2EDuration="2.28257585s" podCreationTimestamp="2026-03-01 09:36:07 +0000 UTC" firstStartedPulling="2026-03-01 09:36:08.130644381 +0000 UTC m=+1697.372523568" lastFinishedPulling="2026-03-01 09:36:08.729248145 +0000 UTC m=+1697.971127342" observedRunningTime="2026-03-01 09:36:09.277928242 +0000 UTC m=+1698.519807449" watchObservedRunningTime="2026-03-01 09:36:09.28257585 +0000 UTC m=+1698.524455047" Mar 01 09:36:09 crc kubenswrapper[4792]: I0301 09:36:09.307062 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cbpls" podStartSLOduration=1.878958288 podStartE2EDuration="5.307031621s" podCreationTimestamp="2026-03-01 09:36:04 +0000 UTC" firstStartedPulling="2026-03-01 09:36:05.183125536 +0000 UTC m=+1694.425004723" lastFinishedPulling="2026-03-01 09:36:08.611198859 +0000 UTC m=+1697.853078056" observedRunningTime="2026-03-01 09:36:09.302387813 +0000 UTC m=+1698.544267010" watchObservedRunningTime="2026-03-01 09:36:09.307031621 +0000 UTC m=+1698.548910818" Mar 01 09:36:14 crc kubenswrapper[4792]: I0301 09:36:14.425503 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:14 crc 
kubenswrapper[4792]: I0301 09:36:14.426111 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:14 crc kubenswrapper[4792]: I0301 09:36:14.485485 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:15 crc kubenswrapper[4792]: I0301 09:36:15.374397 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:15 crc kubenswrapper[4792]: I0301 09:36:15.430958 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cbpls"] Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.337060 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cbpls" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="registry-server" containerID="cri-o://b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041" gracePeriod=2 Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.792280 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.908943 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-catalog-content\") pod \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.909139 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-utilities\") pod \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.909196 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5vt4\" (UniqueName: \"kubernetes.io/projected/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-kube-api-access-t5vt4\") pod \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.910141 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-utilities" (OuterVolumeSpecName: "utilities") pod "ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" (UID: "ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.915412 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-kube-api-access-t5vt4" (OuterVolumeSpecName: "kube-api-access-t5vt4") pod "ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" (UID: "ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c"). InnerVolumeSpecName "kube-api-access-t5vt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.963098 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" (UID: "ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.011289 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.011503 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.011603 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5vt4\" (UniqueName: \"kubernetes.io/projected/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-kube-api-access-t5vt4\") on node \"crc\" DevicePath \"\"" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.348120 4792 generic.go:334] "Generic (PLEG): container finished" podID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerID="b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041" exitCode=0 Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.348168 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbpls" event={"ID":"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c","Type":"ContainerDied","Data":"b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041"} Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.348195 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-cbpls" event={"ID":"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c","Type":"ContainerDied","Data":"5d7b7f5ad9ee24fe834e1813c98805aeb552492372f5eec1abf81f2121a73b86"} Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.348211 4792 scope.go:117] "RemoveContainer" containerID="b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.348358 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.365418 4792 scope.go:117] "RemoveContainer" containerID="5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.386635 4792 scope.go:117] "RemoveContainer" containerID="9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.395373 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cbpls"] Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.411795 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cbpls"] Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.442017 4792 scope.go:117] "RemoveContainer" containerID="b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041" Mar 01 09:36:18 crc kubenswrapper[4792]: E0301 09:36:18.442450 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041\": container with ID starting with b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041 not found: ID does not exist" containerID="b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 
09:36:18.442482 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041"} err="failed to get container status \"b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041\": rpc error: code = NotFound desc = could not find container \"b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041\": container with ID starting with b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041 not found: ID does not exist" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.442508 4792 scope.go:117] "RemoveContainer" containerID="5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83" Mar 01 09:36:18 crc kubenswrapper[4792]: E0301 09:36:18.442952 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83\": container with ID starting with 5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83 not found: ID does not exist" containerID="5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.442978 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83"} err="failed to get container status \"5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83\": rpc error: code = NotFound desc = could not find container \"5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83\": container with ID starting with 5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83 not found: ID does not exist" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.442995 4792 scope.go:117] "RemoveContainer" containerID="9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e" Mar 01 09:36:18 crc 
kubenswrapper[4792]: E0301 09:36:18.443283 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e\": container with ID starting with 9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e not found: ID does not exist" containerID="9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e"
Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.443301 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e"} err="failed to get container status \"9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e\": rpc error: code = NotFound desc = could not find container \"9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e\": container with ID starting with 9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e not found: ID does not exist"
Mar 01 09:36:19 crc kubenswrapper[4792]: I0301 09:36:19.417726 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" path="/var/lib/kubelet/pods/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c/volumes"
Mar 01 09:36:28 crc kubenswrapper[4792]: I0301 09:36:28.177183 4792 scope.go:117] "RemoveContainer" containerID="7926ce126d7f3dd092ea29933967e6329a351e44fde88116cf9663b118841513"
Mar 01 09:36:34 crc kubenswrapper[4792]: I0301 09:36:34.944535 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:36:34 crc kubenswrapper[4792]: I0301 09:36:34.945260 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:37:04 crc kubenswrapper[4792]: I0301 09:37:04.942837 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:37:04 crc kubenswrapper[4792]: I0301 09:37:04.943504 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.057408 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-11c1-account-create-update-8h9xf"]
Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.067275 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-dlv4c"]
Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.078527 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2c3a-account-create-update-vnvfb"]
Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.089753 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-f95nh"]
Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.097600 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-11c1-account-create-update-8h9xf"]
Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.107619 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-dlv4c"]
Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.117793 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2c3a-account-create-update-vnvfb"]
Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.128303 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-f95nh"]
Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.026073 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-8zsss"]
Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.033854 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d8c5-account-create-update-z4zgs"]
Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.052537 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-8zsss"]
Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.061458 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d8c5-account-create-update-z4zgs"]
Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.418617 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="127158ae-b49c-42bd-932d-af85eafce8c0" path="/var/lib/kubelet/pods/127158ae-b49c-42bd-932d-af85eafce8c0/volumes"
Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.419206 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5" path="/var/lib/kubelet/pods/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5/volumes"
Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.419700 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="272107df-b15b-4c97-b9b0-e865f9a391da" path="/var/lib/kubelet/pods/272107df-b15b-4c97-b9b0-e865f9a391da/volumes"
Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.420204 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d8b4e1-c1b5-468c-b319-84985c525d6a" path="/var/lib/kubelet/pods/46d8b4e1-c1b5-468c-b319-84985c525d6a/volumes"
Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.421227 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58e6fcdd-44b2-4c03-9cf6-a772bd0c3779" path="/var/lib/kubelet/pods/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779/volumes"
Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.421726 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869a99e5-f399-4938-ba59-bbe20e23385b" path="/var/lib/kubelet/pods/869a99e5-f399-4938-ba59-bbe20e23385b/volumes"
Mar 01 09:37:24 crc kubenswrapper[4792]: I0301 09:37:24.034002 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2dgrc"]
Mar 01 09:37:24 crc kubenswrapper[4792]: I0301 09:37:24.040962 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2dgrc"]
Mar 01 09:37:24 crc kubenswrapper[4792]: I0301 09:37:24.824615 4792 generic.go:334] "Generic (PLEG): container finished" podID="1f054d9d-4fbb-4909-826c-e6037c4716bd" containerID="2b11700043caea12ce3ec9b1a685865b6228db2c927ef984914abec9ff9701b8" exitCode=0
Mar 01 09:37:24 crc kubenswrapper[4792]: I0301 09:37:24.824666 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" event={"ID":"1f054d9d-4fbb-4909-826c-e6037c4716bd","Type":"ContainerDied","Data":"2b11700043caea12ce3ec9b1a685865b6228db2c927ef984914abec9ff9701b8"}
Mar 01 09:37:25 crc kubenswrapper[4792]: I0301 09:37:25.423257 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9e80b4e-b68a-48a2-b0fe-e5cf19e00669" path="/var/lib/kubelet/pods/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669/volumes"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.222351 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.293093 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8grr9\" (UniqueName: \"kubernetes.io/projected/1f054d9d-4fbb-4909-826c-e6037c4716bd-kube-api-access-8grr9\") pod \"1f054d9d-4fbb-4909-826c-e6037c4716bd\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") "
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.293183 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-inventory\") pod \"1f054d9d-4fbb-4909-826c-e6037c4716bd\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") "
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.293286 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-ssh-key-openstack-edpm-ipam\") pod \"1f054d9d-4fbb-4909-826c-e6037c4716bd\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") "
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.304114 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f054d9d-4fbb-4909-826c-e6037c4716bd-kube-api-access-8grr9" (OuterVolumeSpecName: "kube-api-access-8grr9") pod "1f054d9d-4fbb-4909-826c-e6037c4716bd" (UID: "1f054d9d-4fbb-4909-826c-e6037c4716bd"). InnerVolumeSpecName "kube-api-access-8grr9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.325597 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-inventory" (OuterVolumeSpecName: "inventory") pod "1f054d9d-4fbb-4909-826c-e6037c4716bd" (UID: "1f054d9d-4fbb-4909-826c-e6037c4716bd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.326132 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1f054d9d-4fbb-4909-826c-e6037c4716bd" (UID: "1f054d9d-4fbb-4909-826c-e6037c4716bd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.395451 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8grr9\" (UniqueName: \"kubernetes.io/projected/1f054d9d-4fbb-4909-826c-e6037c4716bd-kube-api-access-8grr9\") on node \"crc\" DevicePath \"\""
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.395768 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-inventory\") on node \"crc\" DevicePath \"\""
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.396094 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.843566 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" event={"ID":"1f054d9d-4fbb-4909-826c-e6037c4716bd","Type":"ContainerDied","Data":"3d5568558fc77cac38af3f9d8b8a28b534315185559e9662b5fc399941b99b47"}
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.843795 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d5568558fc77cac38af3f9d8b8a28b534315185559e9662b5fc399941b99b47"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.843627 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.932900 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"]
Mar 01 09:37:26 crc kubenswrapper[4792]: E0301 09:37:26.933276 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="registry-server"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.933293 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="registry-server"
Mar 01 09:37:26 crc kubenswrapper[4792]: E0301 09:37:26.933303 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="extract-utilities"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.933310 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="extract-utilities"
Mar 01 09:37:26 crc kubenswrapper[4792]: E0301 09:37:26.933324 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="extract-content"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.933330 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="extract-content"
Mar 01 09:37:26 crc kubenswrapper[4792]: E0301 09:37:26.933343 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f054d9d-4fbb-4909-826c-e6037c4716bd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.933350 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f054d9d-4fbb-4909-826c-e6037c4716bd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.933511 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="registry-server"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.933531 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f054d9d-4fbb-4909-826c-e6037c4716bd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.934292 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.936526 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.936530 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.936573 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.938960 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.965269 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"]
Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.004668 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"
Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.004712 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9gdq\" (UniqueName: \"kubernetes.io/projected/9af8a1fb-52d8-4b08-be39-ad106833ba1c-kube-api-access-h9gdq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"
Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.004980 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"
Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.106239 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"
Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.106396 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"
Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.106424 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9gdq\" (UniqueName: \"kubernetes.io/projected/9af8a1fb-52d8-4b08-be39-ad106833ba1c-kube-api-access-h9gdq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"
Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.111344 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"
Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.116375 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"
Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.131478 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9gdq\" (UniqueName: \"kubernetes.io/projected/9af8a1fb-52d8-4b08-be39-ad106833ba1c-kube-api-access-h9gdq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"
Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.258955 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"
Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.836345 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"]
Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.853744 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" event={"ID":"9af8a1fb-52d8-4b08-be39-ad106833ba1c","Type":"ContainerStarted","Data":"1fed7b9f25154948ea1a944d03547c3dec990587c0db6a199eae71c8d98766c6"}
Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.260528 4792 scope.go:117] "RemoveContainer" containerID="98147a64ef321c5a2be94da9872cf3444a2d6ee5365cb23fe0e8d40238b6ab98"
Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.311080 4792 scope.go:117] "RemoveContainer" containerID="5c3a4231cfc20731f9ac2774fb470c532f5db1e9d44253c60e2e47577fa458dc"
Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.388397 4792 scope.go:117] "RemoveContainer" containerID="b99cdab13c59b3d72ed63dfb54dc704e52617818eb25d09b6ad0f435b22c114f"
Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.461093 4792 scope.go:117] "RemoveContainer" containerID="85f8d1f8a57c04591a9aaccb1305c025dd215cbe1527e2573cb115d042731951"
Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.506706 4792 scope.go:117] "RemoveContainer" containerID="3b9a5bf9216213ab73f7db6aa95b33bd1c546b1770c33a00558994664a8fc4ce"
Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.537587 4792 scope.go:117] "RemoveContainer" containerID="79a53da4edce2f856b264b84a40ae3b0fe791d8730afd70b1bb1b19a59aff3f9"
Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.565053 4792 scope.go:117] "RemoveContainer" containerID="0efefdd3dac5f3f586a3c6d6e7f2ba1305e9a8e8544b4e285a4ab7c3e12e8018"
Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.864960 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" event={"ID":"9af8a1fb-52d8-4b08-be39-ad106833ba1c","Type":"ContainerStarted","Data":"263ab29e6e451a06b962c7883f7e5448fbe4595217de2d6fcca680e67789bfae"}
Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.888718 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" podStartSLOduration=2.4177515339999998 podStartE2EDuration="2.888702418s" podCreationTimestamp="2026-03-01 09:37:26 +0000 UTC" firstStartedPulling="2026-03-01 09:37:27.840045161 +0000 UTC m=+1777.081924358" lastFinishedPulling="2026-03-01 09:37:28.310996025 +0000 UTC m=+1777.552875242" observedRunningTime="2026-03-01 09:37:28.882212963 +0000 UTC m=+1778.124092200" watchObservedRunningTime="2026-03-01 09:37:28.888702418 +0000 UTC m=+1778.130581615"
Mar 01 09:37:33 crc kubenswrapper[4792]: I0301 09:37:33.918567 4792 generic.go:334] "Generic (PLEG): container finished" podID="9af8a1fb-52d8-4b08-be39-ad106833ba1c" containerID="263ab29e6e451a06b962c7883f7e5448fbe4595217de2d6fcca680e67789bfae" exitCode=0
Mar 01 09:37:33 crc kubenswrapper[4792]: I0301 09:37:33.918650 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" event={"ID":"9af8a1fb-52d8-4b08-be39-ad106833ba1c","Type":"ContainerDied","Data":"263ab29e6e451a06b962c7883f7e5448fbe4595217de2d6fcca680e67789bfae"}
Mar 01 09:37:34 crc kubenswrapper[4792]: I0301 09:37:34.943074 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:37:34 crc kubenswrapper[4792]: I0301 09:37:34.943126 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:37:34 crc kubenswrapper[4792]: I0301 09:37:34.943170 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv"
Mar 01 09:37:34 crc kubenswrapper[4792]: I0301 09:37:34.944067 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 01 09:37:34 crc kubenswrapper[4792]: I0301 09:37:34.944126 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" gracePeriod=600
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.058167 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-ks68h"]
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.066486 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-ks68h"]
Mar 01 09:37:35 crc kubenswrapper[4792]: E0301 09:37:35.076470 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.384797 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.462032 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" path="/var/lib/kubelet/pods/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89/volumes"
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.470099 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9gdq\" (UniqueName: \"kubernetes.io/projected/9af8a1fb-52d8-4b08-be39-ad106833ba1c-kube-api-access-h9gdq\") pod \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") "
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.470334 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-ssh-key-openstack-edpm-ipam\") pod \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") "
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.470553 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-inventory\") pod \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") "
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.476222 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af8a1fb-52d8-4b08-be39-ad106833ba1c-kube-api-access-h9gdq" (OuterVolumeSpecName: "kube-api-access-h9gdq") pod "9af8a1fb-52d8-4b08-be39-ad106833ba1c" (UID: "9af8a1fb-52d8-4b08-be39-ad106833ba1c"). InnerVolumeSpecName "kube-api-access-h9gdq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.496433 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-inventory" (OuterVolumeSpecName: "inventory") pod "9af8a1fb-52d8-4b08-be39-ad106833ba1c" (UID: "9af8a1fb-52d8-4b08-be39-ad106833ba1c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.506307 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9af8a1fb-52d8-4b08-be39-ad106833ba1c" (UID: "9af8a1fb-52d8-4b08-be39-ad106833ba1c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.574119 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9gdq\" (UniqueName: \"kubernetes.io/projected/9af8a1fb-52d8-4b08-be39-ad106833ba1c-kube-api-access-h9gdq\") on node \"crc\" DevicePath \"\""
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.574150 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.574193 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-inventory\") on node \"crc\" DevicePath \"\""
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.939050 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" exitCode=0
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.939120 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e"}
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.939221 4792 scope.go:117] "RemoveContainer" containerID="6c6c336556c3895a23d652801049ab8fd2cc3ff89812dc0c31bb6831441e0e06"
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.939952 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e"
Mar 01 09:37:35 crc kubenswrapper[4792]: E0301 09:37:35.940348 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.940941 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.940942 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" event={"ID":"9af8a1fb-52d8-4b08-be39-ad106833ba1c","Type":"ContainerDied","Data":"1fed7b9f25154948ea1a944d03547c3dec990587c0db6a199eae71c8d98766c6"}
Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.941013 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fed7b9f25154948ea1a944d03547c3dec990587c0db6a199eae71c8d98766c6"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.061797 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"]
Mar 01 09:37:36 crc kubenswrapper[4792]: E0301 09:37:36.062508 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af8a1fb-52d8-4b08-be39-ad106833ba1c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.062525 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af8a1fb-52d8-4b08-be39-ad106833ba1c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.062754 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af8a1fb-52d8-4b08-be39-ad106833ba1c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.063410 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.068447 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.068954 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.069394 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.072255 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.079447 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"]
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.084565 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9642\" (UniqueName: \"kubernetes.io/projected/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-kube-api-access-f9642\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.084681 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.084710 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.187496 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9642\" (UniqueName: \"kubernetes.io/projected/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-kube-api-access-f9642\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.187717 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.189277 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.196591 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.196633 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.203881 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9642\" (UniqueName: \"kubernetes.io/projected/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-kube-api-access-f9642\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.391966 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.932148 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"]
Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.953174 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" event={"ID":"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8","Type":"ContainerStarted","Data":"26c3c7763b6a7087dbe46227cb8263cd4e006283a3845857a05816e842cb3bd6"}
Mar 01 09:37:37 crc kubenswrapper[4792]: I0301 09:37:37.964224 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" event={"ID":"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8","Type":"ContainerStarted","Data":"40d6bf3bff7640be17f842b1e208ca6cbd13ff723d54eb172fb51e9f37d11d71"}
Mar 01 09:37:37 crc kubenswrapper[4792]: I0301 09:37:37.984365 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" podStartSLOduration=1.590300186 podStartE2EDuration="1.984344808s" podCreationTimestamp="2026-03-01 09:37:36 +0000 UTC" firstStartedPulling="2026-03-01 09:37:36.927628606 +0000 UTC m=+1786.169507803" lastFinishedPulling="2026-03-01 09:37:37.321673228 +0000 UTC m=+1786.563552425" observedRunningTime="2026-03-01 09:37:37.976486288 +0000 UTC m=+1787.218365505" watchObservedRunningTime="2026-03-01 09:37:37.984344808 +0000 UTC m=+1787.226224005"
Mar 01 09:37:49 crc kubenswrapper[4792]: I0301 09:37:49.028017 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-t8rmw"]
Mar 01 09:37:49 crc kubenswrapper[4792]: I0301 09:37:49.034752 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-t8rmw"]
Mar 01 09:37:49 crc kubenswrapper[4792]: I0301 09:37:49.419298 4792 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0b42afb-2954-442e-bc91-4c8275a4d2fd" path="/var/lib/kubelet/pods/f0b42afb-2954-442e-bc91-4c8275a4d2fd/volumes" Mar 01 09:37:51 crc kubenswrapper[4792]: I0301 09:37:51.413870 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:37:51 crc kubenswrapper[4792]: E0301 09:37:51.414352 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.029235 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-zxh6d"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.036837 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-71d5-account-create-update-mjs9k"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.044549 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e676-account-create-update-5ntgh"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.055329 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-zxh6d"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.062774 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-kv8gv"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.072206 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8714-account-create-update-wzssg"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.080150 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-db-create-kv8gv"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.087131 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-71d5-account-create-update-mjs9k"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.093553 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8714-account-create-update-wzssg"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.099783 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e676-account-create-update-5ntgh"] Mar 01 09:37:53 crc kubenswrapper[4792]: I0301 09:37:53.419587 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46b17f7c-595d-4b78-9076-037fb2998f60" path="/var/lib/kubelet/pods/46b17f7c-595d-4b78-9076-037fb2998f60/volumes" Mar 01 09:37:53 crc kubenswrapper[4792]: I0301 09:37:53.420562 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b715bb3f-b181-4614-85c5-9155286ce80c" path="/var/lib/kubelet/pods/b715bb3f-b181-4614-85c5-9155286ce80c/volumes" Mar 01 09:37:53 crc kubenswrapper[4792]: I0301 09:37:53.421102 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd689802-7b27-463e-a155-ed837e8594e6" path="/var/lib/kubelet/pods/bd689802-7b27-463e-a155-ed837e8594e6/volumes" Mar 01 09:37:53 crc kubenswrapper[4792]: I0301 09:37:53.421620 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dabb3d2e-57fa-4ad3-9f3b-b85e0b670650" path="/var/lib/kubelet/pods/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650/volumes" Mar 01 09:37:53 crc kubenswrapper[4792]: I0301 09:37:53.422571 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efc2406b-db33-4a33-86f1-dd69b0f537a1" path="/var/lib/kubelet/pods/efc2406b-db33-4a33-86f1-dd69b0f537a1/volumes" Mar 01 09:37:57 crc kubenswrapper[4792]: I0301 09:37:57.036600 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jsld8"] Mar 01 09:37:57 crc 
kubenswrapper[4792]: I0301 09:37:57.045270 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jsld8"] Mar 01 09:37:57 crc kubenswrapper[4792]: I0301 09:37:57.421463 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="465282ce-1312-4cb6-ae89-de6ada48a901" path="/var/lib/kubelet/pods/465282ce-1312-4cb6-ae89-de6ada48a901/volumes" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.129500 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539298-ckkqh"] Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.130786 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539298-ckkqh" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.132792 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.132989 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.134181 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.146665 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539298-ckkqh"] Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.258676 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgrfb\" (UniqueName: \"kubernetes.io/projected/7e9f36fa-467b-4b49-9d69-b465a22837e5-kube-api-access-jgrfb\") pod \"auto-csr-approver-29539298-ckkqh\" (UID: \"7e9f36fa-467b-4b49-9d69-b465a22837e5\") " pod="openshift-infra/auto-csr-approver-29539298-ckkqh" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.361406 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jgrfb\" (UniqueName: \"kubernetes.io/projected/7e9f36fa-467b-4b49-9d69-b465a22837e5-kube-api-access-jgrfb\") pod \"auto-csr-approver-29539298-ckkqh\" (UID: \"7e9f36fa-467b-4b49-9d69-b465a22837e5\") " pod="openshift-infra/auto-csr-approver-29539298-ckkqh" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.383893 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgrfb\" (UniqueName: \"kubernetes.io/projected/7e9f36fa-467b-4b49-9d69-b465a22837e5-kube-api-access-jgrfb\") pod \"auto-csr-approver-29539298-ckkqh\" (UID: \"7e9f36fa-467b-4b49-9d69-b465a22837e5\") " pod="openshift-infra/auto-csr-approver-29539298-ckkqh" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.453112 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539298-ckkqh" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.896382 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539298-ckkqh"] Mar 01 09:38:01 crc kubenswrapper[4792]: I0301 09:38:01.157565 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539298-ckkqh" event={"ID":"7e9f36fa-467b-4b49-9d69-b465a22837e5","Type":"ContainerStarted","Data":"c746804eed8758bb95b1fa6973b5ac0ac15ae1549125e1b4807551d5b907ac5b"} Mar 01 09:38:02 crc kubenswrapper[4792]: I0301 09:38:02.174547 4792 generic.go:334] "Generic (PLEG): container finished" podID="7e9f36fa-467b-4b49-9d69-b465a22837e5" containerID="fe8d25b14be5e63dea82359594e611616a93ae643f10a8ff38209498ecbc612f" exitCode=0 Mar 01 09:38:02 crc kubenswrapper[4792]: I0301 09:38:02.174651 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539298-ckkqh" 
event={"ID":"7e9f36fa-467b-4b49-9d69-b465a22837e5","Type":"ContainerDied","Data":"fe8d25b14be5e63dea82359594e611616a93ae643f10a8ff38209498ecbc612f"} Mar 01 09:38:03 crc kubenswrapper[4792]: I0301 09:38:03.488639 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539298-ckkqh" Mar 01 09:38:03 crc kubenswrapper[4792]: I0301 09:38:03.619811 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgrfb\" (UniqueName: \"kubernetes.io/projected/7e9f36fa-467b-4b49-9d69-b465a22837e5-kube-api-access-jgrfb\") pod \"7e9f36fa-467b-4b49-9d69-b465a22837e5\" (UID: \"7e9f36fa-467b-4b49-9d69-b465a22837e5\") " Mar 01 09:38:03 crc kubenswrapper[4792]: I0301 09:38:03.631082 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9f36fa-467b-4b49-9d69-b465a22837e5-kube-api-access-jgrfb" (OuterVolumeSpecName: "kube-api-access-jgrfb") pod "7e9f36fa-467b-4b49-9d69-b465a22837e5" (UID: "7e9f36fa-467b-4b49-9d69-b465a22837e5"). InnerVolumeSpecName "kube-api-access-jgrfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:38:03 crc kubenswrapper[4792]: I0301 09:38:03.721828 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgrfb\" (UniqueName: \"kubernetes.io/projected/7e9f36fa-467b-4b49-9d69-b465a22837e5-kube-api-access-jgrfb\") on node \"crc\" DevicePath \"\"" Mar 01 09:38:04 crc kubenswrapper[4792]: I0301 09:38:04.192281 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539298-ckkqh" event={"ID":"7e9f36fa-467b-4b49-9d69-b465a22837e5","Type":"ContainerDied","Data":"c746804eed8758bb95b1fa6973b5ac0ac15ae1549125e1b4807551d5b907ac5b"} Mar 01 09:38:04 crc kubenswrapper[4792]: I0301 09:38:04.192333 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c746804eed8758bb95b1fa6973b5ac0ac15ae1549125e1b4807551d5b907ac5b" Mar 01 09:38:04 crc kubenswrapper[4792]: I0301 09:38:04.192344 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539298-ckkqh" Mar 01 09:38:04 crc kubenswrapper[4792]: I0301 09:38:04.409342 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:38:04 crc kubenswrapper[4792]: E0301 09:38:04.409581 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:38:04 crc kubenswrapper[4792]: I0301 09:38:04.548523 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539292-vszff"] Mar 01 09:38:04 crc kubenswrapper[4792]: I0301 09:38:04.556423 4792 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539292-vszff"] Mar 01 09:38:05 crc kubenswrapper[4792]: I0301 09:38:05.417273 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc" path="/var/lib/kubelet/pods/13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc/volumes" Mar 01 09:38:15 crc kubenswrapper[4792]: I0301 09:38:15.280007 4792 generic.go:334] "Generic (PLEG): container finished" podID="31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" containerID="40d6bf3bff7640be17f842b1e208ca6cbd13ff723d54eb172fb51e9f37d11d71" exitCode=0 Mar 01 09:38:15 crc kubenswrapper[4792]: I0301 09:38:15.280208 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" event={"ID":"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8","Type":"ContainerDied","Data":"40d6bf3bff7640be17f842b1e208ca6cbd13ff723d54eb172fb51e9f37d11d71"} Mar 01 09:38:15 crc kubenswrapper[4792]: I0301 09:38:15.413171 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:38:15 crc kubenswrapper[4792]: E0301 09:38:15.413404 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.760095 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.800255 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-ssh-key-openstack-edpm-ipam\") pod \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.800336 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9642\" (UniqueName: \"kubernetes.io/projected/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-kube-api-access-f9642\") pod \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.800419 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-inventory\") pod \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.805711 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-kube-api-access-f9642" (OuterVolumeSpecName: "kube-api-access-f9642") pod "31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" (UID: "31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8"). InnerVolumeSpecName "kube-api-access-f9642". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.827142 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-inventory" (OuterVolumeSpecName: "inventory") pod "31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" (UID: "31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.829522 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" (UID: "31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.902027 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.902056 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9642\" (UniqueName: \"kubernetes.io/projected/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-kube-api-access-f9642\") on node \"crc\" DevicePath \"\"" Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.902068 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.297661 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" event={"ID":"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8","Type":"ContainerDied","Data":"26c3c7763b6a7087dbe46227cb8263cd4e006283a3845857a05816e842cb3bd6"} Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.297700 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26c3c7763b6a7087dbe46227cb8263cd4e006283a3845857a05816e842cb3bd6" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 
09:38:17.297754 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.379991 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt"] Mar 01 09:38:17 crc kubenswrapper[4792]: E0301 09:38:17.380407 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.380430 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:38:17 crc kubenswrapper[4792]: E0301 09:38:17.380442 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9f36fa-467b-4b49-9d69-b465a22837e5" containerName="oc" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.380452 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9f36fa-467b-4b49-9d69-b465a22837e5" containerName="oc" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.380710 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.380733 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e9f36fa-467b-4b49-9d69-b465a22837e5" containerName="oc" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.381451 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.383509 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.384525 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.384988 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.390920 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt"] Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.392644 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.511629 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.511689 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.511810 
4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lqcj\" (UniqueName: \"kubernetes.io/projected/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-kube-api-access-6lqcj\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.613938 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lqcj\" (UniqueName: \"kubernetes.io/projected/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-kube-api-access-6lqcj\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.614047 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.614072 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.617549 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.618464 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.634046 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lqcj\" (UniqueName: \"kubernetes.io/projected/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-kube-api-access-6lqcj\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.697093 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:18 crc kubenswrapper[4792]: W0301 09:38:18.259185 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8787b5ba_7462_4594_a11d_2d0afbfe3c1c.slice/crio-cc836aef7a180983b871b836e0db2dda5971b8cc57635fccf899b4b8a37fabc2 WatchSource:0}: Error finding container cc836aef7a180983b871b836e0db2dda5971b8cc57635fccf899b4b8a37fabc2: Status 404 returned error can't find the container with id cc836aef7a180983b871b836e0db2dda5971b8cc57635fccf899b4b8a37fabc2 Mar 01 09:38:18 crc kubenswrapper[4792]: I0301 09:38:18.264926 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt"] Mar 01 09:38:18 crc kubenswrapper[4792]: I0301 09:38:18.305509 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" event={"ID":"8787b5ba-7462-4594-a11d-2d0afbfe3c1c","Type":"ContainerStarted","Data":"cc836aef7a180983b871b836e0db2dda5971b8cc57635fccf899b4b8a37fabc2"} Mar 01 09:38:19 crc kubenswrapper[4792]: I0301 09:38:19.316387 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" event={"ID":"8787b5ba-7462-4594-a11d-2d0afbfe3c1c","Type":"ContainerStarted","Data":"a0ae22fc112a93c604faf629d5ca18987a7aa343d92829f2e65e4501f2454496"} Mar 01 09:38:19 crc kubenswrapper[4792]: I0301 09:38:19.338018 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" podStartSLOduration=1.857130669 podStartE2EDuration="2.337999675s" podCreationTimestamp="2026-03-01 09:38:17 +0000 UTC" firstStartedPulling="2026-03-01 09:38:18.261493321 +0000 UTC m=+1827.503372518" lastFinishedPulling="2026-03-01 09:38:18.742362327 +0000 UTC m=+1827.984241524" 
observedRunningTime="2026-03-01 09:38:19.337744198 +0000 UTC m=+1828.579623395" watchObservedRunningTime="2026-03-01 09:38:19.337999675 +0000 UTC m=+1828.579878872" Mar 01 09:38:23 crc kubenswrapper[4792]: I0301 09:38:23.346882 4792 generic.go:334] "Generic (PLEG): container finished" podID="8787b5ba-7462-4594-a11d-2d0afbfe3c1c" containerID="a0ae22fc112a93c604faf629d5ca18987a7aa343d92829f2e65e4501f2454496" exitCode=0 Mar 01 09:38:23 crc kubenswrapper[4792]: I0301 09:38:23.347421 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" event={"ID":"8787b5ba-7462-4594-a11d-2d0afbfe3c1c","Type":"ContainerDied","Data":"a0ae22fc112a93c604faf629d5ca18987a7aa343d92829f2e65e4501f2454496"} Mar 01 09:38:24 crc kubenswrapper[4792]: I0301 09:38:24.801660 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:24 crc kubenswrapper[4792]: I0301 09:38:24.960098 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lqcj\" (UniqueName: \"kubernetes.io/projected/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-kube-api-access-6lqcj\") pod \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " Mar 01 09:38:24 crc kubenswrapper[4792]: I0301 09:38:24.960209 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-inventory\") pod \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " Mar 01 09:38:24 crc kubenswrapper[4792]: I0301 09:38:24.960417 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-ssh-key-openstack-edpm-ipam\") pod 
\"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " Mar 01 09:38:24 crc kubenswrapper[4792]: I0301 09:38:24.966233 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-kube-api-access-6lqcj" (OuterVolumeSpecName: "kube-api-access-6lqcj") pod "8787b5ba-7462-4594-a11d-2d0afbfe3c1c" (UID: "8787b5ba-7462-4594-a11d-2d0afbfe3c1c"). InnerVolumeSpecName "kube-api-access-6lqcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:38:24 crc kubenswrapper[4792]: I0301 09:38:24.986652 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-inventory" (OuterVolumeSpecName: "inventory") pod "8787b5ba-7462-4594-a11d-2d0afbfe3c1c" (UID: "8787b5ba-7462-4594-a11d-2d0afbfe3c1c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:38:24 crc kubenswrapper[4792]: I0301 09:38:24.998494 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8787b5ba-7462-4594-a11d-2d0afbfe3c1c" (UID: "8787b5ba-7462-4594-a11d-2d0afbfe3c1c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.062985 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lqcj\" (UniqueName: \"kubernetes.io/projected/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-kube-api-access-6lqcj\") on node \"crc\" DevicePath \"\"" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.063286 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.063296 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.370416 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" event={"ID":"8787b5ba-7462-4594-a11d-2d0afbfe3c1c","Type":"ContainerDied","Data":"cc836aef7a180983b871b836e0db2dda5971b8cc57635fccf899b4b8a37fabc2"} Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.370495 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc836aef7a180983b871b836e0db2dda5971b8cc57635fccf899b4b8a37fabc2" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.370566 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.445040 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj"] Mar 01 09:38:25 crc kubenswrapper[4792]: E0301 09:38:25.445477 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8787b5ba-7462-4594-a11d-2d0afbfe3c1c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.445505 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8787b5ba-7462-4594-a11d-2d0afbfe3c1c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.445752 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8787b5ba-7462-4594-a11d-2d0afbfe3c1c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.451376 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.453282 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.453334 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.454114 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.454609 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.460237 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj"] Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.480263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.480306 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgrps\" (UniqueName: \"kubernetes.io/projected/a6a7e948-b141-4fb0-b717-3d02a9014dd4-kube-api-access-tgrps\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.480346 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.582245 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.582474 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgrps\" (UniqueName: \"kubernetes.io/projected/a6a7e948-b141-4fb0-b717-3d02a9014dd4-kube-api-access-tgrps\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.582577 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.586048 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.586225 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.606015 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgrps\" (UniqueName: \"kubernetes.io/projected/a6a7e948-b141-4fb0-b717-3d02a9014dd4-kube-api-access-tgrps\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.769436 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:26 crc kubenswrapper[4792]: I0301 09:38:26.260444 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj"] Mar 01 09:38:26 crc kubenswrapper[4792]: I0301 09:38:26.378562 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" event={"ID":"a6a7e948-b141-4fb0-b717-3d02a9014dd4","Type":"ContainerStarted","Data":"05fc82db8e8a9121da2f3c828eec2d2e2261fe72fe811c5f9e0e91a7c4ce867b"} Mar 01 09:38:27 crc kubenswrapper[4792]: I0301 09:38:27.041412 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-gbmwh"] Mar 01 09:38:27 crc kubenswrapper[4792]: I0301 09:38:27.049384 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-gbmwh"] Mar 01 09:38:27 crc kubenswrapper[4792]: I0301 09:38:27.386768 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" event={"ID":"a6a7e948-b141-4fb0-b717-3d02a9014dd4","Type":"ContainerStarted","Data":"291e30822651ff807afe1c8290d577ee02386c82661acb3758a8b13541958167"} Mar 01 09:38:27 crc kubenswrapper[4792]: I0301 09:38:27.409041 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" podStartSLOduration=2.010579763 podStartE2EDuration="2.409020157s" podCreationTimestamp="2026-03-01 09:38:25 +0000 UTC" firstStartedPulling="2026-03-01 09:38:26.279762933 +0000 UTC m=+1835.521642130" lastFinishedPulling="2026-03-01 09:38:26.678203327 +0000 UTC m=+1835.920082524" observedRunningTime="2026-03-01 09:38:27.405897158 +0000 UTC m=+1836.647776355" watchObservedRunningTime="2026-03-01 09:38:27.409020157 +0000 UTC m=+1836.650899354" Mar 01 09:38:27 crc kubenswrapper[4792]: I0301 09:38:27.420320 
4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66aba873-81b0-452a-81f9-73cc18445180" path="/var/lib/kubelet/pods/66aba873-81b0-452a-81f9-73cc18445180/volumes" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.409400 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:38:28 crc kubenswrapper[4792]: E0301 09:38:28.409687 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.702408 4792 scope.go:117] "RemoveContainer" containerID="3d36f5fe200b4f79f67807598d19a358fe63f35f70500bebea7ecf29d4c8c11d" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.733014 4792 scope.go:117] "RemoveContainer" containerID="2a9eb88c21c0505fd080c3b8fba46cc255546b5fb4c130561920988c70383a89" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.766745 4792 scope.go:117] "RemoveContainer" containerID="01f9646a0afc7be0a0075cd21cc75eb15ebfcd51abc1ad1bd7ae6a925a4bfdd3" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.814674 4792 scope.go:117] "RemoveContainer" containerID="a5333afd5d7c2f19e4d0551bd45c113ef37b9f8fcc1a7b85eb962769ca9d63e5" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.845842 4792 scope.go:117] "RemoveContainer" containerID="1cc52ebb7e1b86f46dbab0e11949d60082faaf96962f0529b5c27c6156f59218" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.877354 4792 scope.go:117] "RemoveContainer" containerID="8ffaa11ad79d37459635175d7fbda620c8204659760d7268eb92ed800bd1a03d" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.929961 
4792 scope.go:117] "RemoveContainer" containerID="d1c76ac502d7f1c626951dc28dbcf8372a61e85a2c8a22e8bee3f4ce1c1f91c2" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.951844 4792 scope.go:117] "RemoveContainer" containerID="ca00fe55b9ec531c46f7a5dbc120ce6185518403d6904d883bc1ec756288e2a0" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.983294 4792 scope.go:117] "RemoveContainer" containerID="d49c80ba137c1dfc24f0da7a4050addac018dbbe5eed7701b9bf0c31b472eef5" Mar 01 09:38:29 crc kubenswrapper[4792]: I0301 09:38:29.007826 4792 scope.go:117] "RemoveContainer" containerID="acaeed4a0d4cc4c819f11994601b2946e5e093014d4fc45dbb6ce057d16aef6a" Mar 01 09:38:30 crc kubenswrapper[4792]: I0301 09:38:30.029349 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-f89zl"] Mar 01 09:38:30 crc kubenswrapper[4792]: I0301 09:38:30.039624 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-f89zl"] Mar 01 09:38:31 crc kubenswrapper[4792]: I0301 09:38:31.420535 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e623b24a-64a5-4209-86bb-1814ae9c400b" path="/var/lib/kubelet/pods/e623b24a-64a5-4209-86bb-1814ae9c400b/volumes" Mar 01 09:38:33 crc kubenswrapper[4792]: I0301 09:38:33.028687 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-m9l5f"] Mar 01 09:38:33 crc kubenswrapper[4792]: I0301 09:38:33.036239 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-m9l5f"] Mar 01 09:38:33 crc kubenswrapper[4792]: I0301 09:38:33.419328 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7108e9ac-8215-41ca-ac84-3b3851142a42" path="/var/lib/kubelet/pods/7108e9ac-8215-41ca-ac84-3b3851142a42/volumes" Mar 01 09:38:41 crc kubenswrapper[4792]: I0301 09:38:41.413768 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:38:41 crc 
kubenswrapper[4792]: E0301 09:38:41.414509 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:38:48 crc kubenswrapper[4792]: I0301 09:38:48.033450 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-bxx5d"] Mar 01 09:38:48 crc kubenswrapper[4792]: I0301 09:38:48.041062 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-bxx5d"] Mar 01 09:38:49 crc kubenswrapper[4792]: I0301 09:38:49.418400 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" path="/var/lib/kubelet/pods/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd/volumes" Mar 01 09:38:51 crc kubenswrapper[4792]: I0301 09:38:51.037570 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-gsxqb"] Mar 01 09:38:51 crc kubenswrapper[4792]: I0301 09:38:51.046776 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-gsxqb"] Mar 01 09:38:51 crc kubenswrapper[4792]: I0301 09:38:51.422354 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737aa0a0-6e53-451e-9d5f-2deada87b5b4" path="/var/lib/kubelet/pods/737aa0a0-6e53-451e-9d5f-2deada87b5b4/volumes" Mar 01 09:38:52 crc kubenswrapper[4792]: I0301 09:38:52.408677 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:38:52 crc kubenswrapper[4792]: E0301 09:38:52.409217 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:39:04 crc kubenswrapper[4792]: I0301 09:39:04.412181 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:39:04 crc kubenswrapper[4792]: E0301 09:39:04.412882 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:39:15 crc kubenswrapper[4792]: I0301 09:39:15.408996 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:39:15 crc kubenswrapper[4792]: E0301 09:39:15.409842 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:39:18 crc kubenswrapper[4792]: I0301 09:39:18.827509 4792 generic.go:334] "Generic (PLEG): container finished" podID="a6a7e948-b141-4fb0-b717-3d02a9014dd4" containerID="291e30822651ff807afe1c8290d577ee02386c82661acb3758a8b13541958167" exitCode=0 Mar 01 09:39:18 crc kubenswrapper[4792]: I0301 09:39:18.827573 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" event={"ID":"a6a7e948-b141-4fb0-b717-3d02a9014dd4","Type":"ContainerDied","Data":"291e30822651ff807afe1c8290d577ee02386c82661acb3758a8b13541958167"} Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.228202 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.351006 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgrps\" (UniqueName: \"kubernetes.io/projected/a6a7e948-b141-4fb0-b717-3d02a9014dd4-kube-api-access-tgrps\") pod \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.351206 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-inventory\") pod \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.351301 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-ssh-key-openstack-edpm-ipam\") pod \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.356962 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6a7e948-b141-4fb0-b717-3d02a9014dd4-kube-api-access-tgrps" (OuterVolumeSpecName: "kube-api-access-tgrps") pod "a6a7e948-b141-4fb0-b717-3d02a9014dd4" (UID: "a6a7e948-b141-4fb0-b717-3d02a9014dd4"). InnerVolumeSpecName "kube-api-access-tgrps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.393456 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a6a7e948-b141-4fb0-b717-3d02a9014dd4" (UID: "a6a7e948-b141-4fb0-b717-3d02a9014dd4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.395353 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-inventory" (OuterVolumeSpecName: "inventory") pod "a6a7e948-b141-4fb0-b717-3d02a9014dd4" (UID: "a6a7e948-b141-4fb0-b717-3d02a9014dd4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.453966 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.454641 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.454656 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgrps\" (UniqueName: \"kubernetes.io/projected/a6a7e948-b141-4fb0-b717-3d02a9014dd4-kube-api-access-tgrps\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.845699 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" 
event={"ID":"a6a7e948-b141-4fb0-b717-3d02a9014dd4","Type":"ContainerDied","Data":"05fc82db8e8a9121da2f3c828eec2d2e2261fe72fe811c5f9e0e91a7c4ce867b"} Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.845740 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05fc82db8e8a9121da2f3c828eec2d2e2261fe72fe811c5f9e0e91a7c4ce867b" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.845792 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.939835 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-27xxx"] Mar 01 09:39:20 crc kubenswrapper[4792]: E0301 09:39:20.940388 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a7e948-b141-4fb0-b717-3d02a9014dd4" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.940409 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a7e948-b141-4fb0-b717-3d02a9014dd4" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.940625 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6a7e948-b141-4fb0-b717-3d02a9014dd4" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.941395 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.943694 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.943945 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.944181 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.944390 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.962974 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-27xxx"] Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.065795 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.065921 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8whbm\" (UniqueName: \"kubernetes.io/projected/b5ccb279-c8b2-4288-9072-1175061be204-kube-api-access-8whbm\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.066004 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.167736 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8whbm\" (UniqueName: \"kubernetes.io/projected/b5ccb279-c8b2-4288-9072-1175061be204-kube-api-access-8whbm\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.167792 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.167920 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.175862 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc 
kubenswrapper[4792]: I0301 09:39:21.176382 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.185421 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8whbm\" (UniqueName: \"kubernetes.io/projected/b5ccb279-c8b2-4288-9072-1175061be204-kube-api-access-8whbm\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.259729 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.761232 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-27xxx"] Mar 01 09:39:21 crc kubenswrapper[4792]: W0301 09:39:21.768036 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5ccb279_c8b2_4288_9072_1175061be204.slice/crio-1bbb3c139b7d53c7d463ca62df59c4f4dd38f92de5cd1ea315a5ed35bc41088b WatchSource:0}: Error finding container 1bbb3c139b7d53c7d463ca62df59c4f4dd38f92de5cd1ea315a5ed35bc41088b: Status 404 returned error can't find the container with id 1bbb3c139b7d53c7d463ca62df59c4f4dd38f92de5cd1ea315a5ed35bc41088b Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.771825 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.854707 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" event={"ID":"b5ccb279-c8b2-4288-9072-1175061be204","Type":"ContainerStarted","Data":"1bbb3c139b7d53c7d463ca62df59c4f4dd38f92de5cd1ea315a5ed35bc41088b"} Mar 01 09:39:22 crc kubenswrapper[4792]: I0301 09:39:22.869974 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" event={"ID":"b5ccb279-c8b2-4288-9072-1175061be204","Type":"ContainerStarted","Data":"6ad37eb5a9ce8285310a5d61f804630e8b0a3954519985d12a0d71d52e93d217"} Mar 01 09:39:22 crc kubenswrapper[4792]: I0301 09:39:22.897050 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" podStartSLOduration=2.423246503 podStartE2EDuration="2.897033189s" podCreationTimestamp="2026-03-01 09:39:20 +0000 UTC" firstStartedPulling="2026-03-01 09:39:21.771532701 +0000 UTC m=+1891.013411908" lastFinishedPulling="2026-03-01 09:39:22.245319397 +0000 UTC m=+1891.487198594" observedRunningTime="2026-03-01 09:39:22.887858826 +0000 UTC m=+1892.129738043" watchObservedRunningTime="2026-03-01 09:39:22.897033189 +0000 UTC m=+1892.138912386" Mar 01 09:39:28 crc kubenswrapper[4792]: I0301 09:39:28.408890 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:39:28 crc kubenswrapper[4792]: E0301 09:39:28.409510 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:39:28 crc kubenswrapper[4792]: I0301 09:39:28.916626 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="b5ccb279-c8b2-4288-9072-1175061be204" containerID="6ad37eb5a9ce8285310a5d61f804630e8b0a3954519985d12a0d71d52e93d217" exitCode=0 Mar 01 09:39:28 crc kubenswrapper[4792]: I0301 09:39:28.916688 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" event={"ID":"b5ccb279-c8b2-4288-9072-1175061be204","Type":"ContainerDied","Data":"6ad37eb5a9ce8285310a5d61f804630e8b0a3954519985d12a0d71d52e93d217"} Mar 01 09:39:29 crc kubenswrapper[4792]: I0301 09:39:29.215746 4792 scope.go:117] "RemoveContainer" containerID="5753e582a896b76584d26c8b6fbaf1b0c86841fe9960e87056d0ee4ab735dcee" Mar 01 09:39:29 crc kubenswrapper[4792]: I0301 09:39:29.264378 4792 scope.go:117] "RemoveContainer" containerID="eb8be93bccd98a25e35b0b04ca4b752b359f9eaa5d34f76412ea01464dd8c3f9" Mar 01 09:39:29 crc kubenswrapper[4792]: I0301 09:39:29.288960 4792 scope.go:117] "RemoveContainer" containerID="4a5e793bcbd54f67d2aa56894763cca0ce1c06ab0ab5c25152dbd8e3b2985066" Mar 01 09:39:29 crc kubenswrapper[4792]: I0301 09:39:29.330431 4792 scope.go:117] "RemoveContainer" containerID="fded697acef2e4939680efd28c30dd0707c1b449f6152a36c981b92695845052" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.295420 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.346546 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8whbm\" (UniqueName: \"kubernetes.io/projected/b5ccb279-c8b2-4288-9072-1175061be204-kube-api-access-8whbm\") pod \"b5ccb279-c8b2-4288-9072-1175061be204\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.348715 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-inventory-0\") pod \"b5ccb279-c8b2-4288-9072-1175061be204\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.348797 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-ssh-key-openstack-edpm-ipam\") pod \"b5ccb279-c8b2-4288-9072-1175061be204\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.367802 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5ccb279-c8b2-4288-9072-1175061be204-kube-api-access-8whbm" (OuterVolumeSpecName: "kube-api-access-8whbm") pod "b5ccb279-c8b2-4288-9072-1175061be204" (UID: "b5ccb279-c8b2-4288-9072-1175061be204"). InnerVolumeSpecName "kube-api-access-8whbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.380846 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "b5ccb279-c8b2-4288-9072-1175061be204" (UID: "b5ccb279-c8b2-4288-9072-1175061be204"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.401209 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b5ccb279-c8b2-4288-9072-1175061be204" (UID: "b5ccb279-c8b2-4288-9072-1175061be204"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.451311 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8whbm\" (UniqueName: \"kubernetes.io/projected/b5ccb279-c8b2-4288-9072-1175061be204-kube-api-access-8whbm\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.452872 4792 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.453013 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.933084 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" event={"ID":"b5ccb279-c8b2-4288-9072-1175061be204","Type":"ContainerDied","Data":"1bbb3c139b7d53c7d463ca62df59c4f4dd38f92de5cd1ea315a5ed35bc41088b"} Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.933406 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bbb3c139b7d53c7d463ca62df59c4f4dd38f92de5cd1ea315a5ed35bc41088b" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.933126 
4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.007108 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5"] Mar 01 09:39:31 crc kubenswrapper[4792]: E0301 09:39:31.007662 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ccb279-c8b2-4288-9072-1175061be204" containerName="ssh-known-hosts-edpm-deployment" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.007728 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ccb279-c8b2-4288-9072-1175061be204" containerName="ssh-known-hosts-edpm-deployment" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.008010 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ccb279-c8b2-4288-9072-1175061be204" containerName="ssh-known-hosts-edpm-deployment" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.008692 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.015173 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.015323 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.015704 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.015885 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.031505 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5"] Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.063775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.064022 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.064089 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99hfr\" (UniqueName: \"kubernetes.io/projected/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-kube-api-access-99hfr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.166734 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.166829 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.166940 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99hfr\" (UniqueName: \"kubernetes.io/projected/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-kube-api-access-99hfr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.171305 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.175372 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.191632 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99hfr\" (UniqueName: \"kubernetes.io/projected/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-kube-api-access-99hfr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.333579 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.939566 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5"] Mar 01 09:39:32 crc kubenswrapper[4792]: I0301 09:39:32.953860 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" event={"ID":"5c6429ad-21d8-4f58-900b-e5f6fe4d603d","Type":"ContainerStarted","Data":"4100c168a5febe541b2b6fdd770ebafed4e19b504bd5f0104ce3f530de9d8c6d"} Mar 01 09:39:32 crc kubenswrapper[4792]: I0301 09:39:32.954201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" event={"ID":"5c6429ad-21d8-4f58-900b-e5f6fe4d603d","Type":"ContainerStarted","Data":"fb07e1f6288a759448ea11b2782889c47aeb90f77ddbe7b61acd663c1b9e7723"} Mar 01 09:39:32 crc kubenswrapper[4792]: I0301 09:39:32.969251 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" podStartSLOduration=2.5538495020000003 podStartE2EDuration="2.969234086s" podCreationTimestamp="2026-03-01 09:39:30 +0000 UTC" firstStartedPulling="2026-03-01 09:39:31.950021955 +0000 UTC m=+1901.191901162" lastFinishedPulling="2026-03-01 09:39:32.365406549 +0000 UTC m=+1901.607285746" observedRunningTime="2026-03-01 09:39:32.966954598 +0000 UTC m=+1902.208833795" watchObservedRunningTime="2026-03-01 09:39:32.969234086 +0000 UTC m=+1902.211113283" Mar 01 09:39:35 crc kubenswrapper[4792]: I0301 09:39:35.047214 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3b7a-account-create-update-2vc26"] Mar 01 09:39:35 crc kubenswrapper[4792]: I0301 09:39:35.054055 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-3b7a-account-create-update-2vc26"] Mar 01 09:39:35 crc kubenswrapper[4792]: 
I0301 09:39:35.421806 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2" path="/var/lib/kubelet/pods/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2/volumes" Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.047541 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-zj224"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.059522 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-8f2r2"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.081949 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qt42r"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.094993 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c72f-account-create-update-x9vvj"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.104229 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-474a-account-create-update-dlgkl"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.113141 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-zj224"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.121652 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-8f2r2"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.128887 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-474a-account-create-update-dlgkl"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.136606 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-qt42r"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.150476 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c72f-account-create-update-x9vvj"] Mar 01 09:39:37 crc kubenswrapper[4792]: I0301 09:39:37.421662 4792 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="09b4c86e-31ba-4d91-a602-39fa3a57c798" path="/var/lib/kubelet/pods/09b4c86e-31ba-4d91-a602-39fa3a57c798/volumes" Mar 01 09:39:37 crc kubenswrapper[4792]: I0301 09:39:37.423127 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b0442e-f4b4-4f59-b3c5-1510ae4d792c" path="/var/lib/kubelet/pods/21b0442e-f4b4-4f59-b3c5-1510ae4d792c/volumes" Mar 01 09:39:37 crc kubenswrapper[4792]: I0301 09:39:37.423678 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a069955e-f546-4522-97ec-5a529f79b1aa" path="/var/lib/kubelet/pods/a069955e-f546-4522-97ec-5a529f79b1aa/volumes" Mar 01 09:39:37 crc kubenswrapper[4792]: I0301 09:39:37.424257 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe" path="/var/lib/kubelet/pods/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe/volumes" Mar 01 09:39:37 crc kubenswrapper[4792]: I0301 09:39:37.425484 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2be4f49-c20a-4e25-bff3-e4617d275fa1" path="/var/lib/kubelet/pods/f2be4f49-c20a-4e25-bff3-e4617d275fa1/volumes" Mar 01 09:39:39 crc kubenswrapper[4792]: I0301 09:39:39.409126 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:39:39 crc kubenswrapper[4792]: E0301 09:39:39.409764 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:39:41 crc kubenswrapper[4792]: I0301 09:39:41.010855 4792 generic.go:334] "Generic (PLEG): container finished" podID="5c6429ad-21d8-4f58-900b-e5f6fe4d603d" 
containerID="4100c168a5febe541b2b6fdd770ebafed4e19b504bd5f0104ce3f530de9d8c6d" exitCode=0 Mar 01 09:39:41 crc kubenswrapper[4792]: I0301 09:39:41.010925 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" event={"ID":"5c6429ad-21d8-4f58-900b-e5f6fe4d603d","Type":"ContainerDied","Data":"4100c168a5febe541b2b6fdd770ebafed4e19b504bd5f0104ce3f530de9d8c6d"} Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.498520 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.672640 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-ssh-key-openstack-edpm-ipam\") pod \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.672685 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99hfr\" (UniqueName: \"kubernetes.io/projected/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-kube-api-access-99hfr\") pod \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.672791 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-inventory\") pod \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.678415 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-kube-api-access-99hfr" (OuterVolumeSpecName: "kube-api-access-99hfr") pod 
"5c6429ad-21d8-4f58-900b-e5f6fe4d603d" (UID: "5c6429ad-21d8-4f58-900b-e5f6fe4d603d"). InnerVolumeSpecName "kube-api-access-99hfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.698150 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-inventory" (OuterVolumeSpecName: "inventory") pod "5c6429ad-21d8-4f58-900b-e5f6fe4d603d" (UID: "5c6429ad-21d8-4f58-900b-e5f6fe4d603d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.703798 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5c6429ad-21d8-4f58-900b-e5f6fe4d603d" (UID: "5c6429ad-21d8-4f58-900b-e5f6fe4d603d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.774054 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.774084 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99hfr\" (UniqueName: \"kubernetes.io/projected/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-kube-api-access-99hfr\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.774095 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.032754 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" event={"ID":"5c6429ad-21d8-4f58-900b-e5f6fe4d603d","Type":"ContainerDied","Data":"fb07e1f6288a759448ea11b2782889c47aeb90f77ddbe7b61acd663c1b9e7723"} Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.032798 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb07e1f6288a759448ea11b2782889c47aeb90f77ddbe7b61acd663c1b9e7723" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.032821 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.200005 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh"] Mar 01 09:39:43 crc kubenswrapper[4792]: E0301 09:39:43.200371 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6429ad-21d8-4f58-900b-e5f6fe4d603d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.200392 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6429ad-21d8-4f58-900b-e5f6fe4d603d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.200564 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c6429ad-21d8-4f58-900b-e5f6fe4d603d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.201122 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.204573 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.204596 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.204840 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.204887 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.218847 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh"] Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.384228 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.384864 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk765\" (UniqueName: \"kubernetes.io/projected/91d95c97-b82e-413c-b05a-3e9cb36e504e-kube-api-access-nk765\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.385002 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.486996 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.487866 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk765\" (UniqueName: \"kubernetes.io/projected/91d95c97-b82e-413c-b05a-3e9cb36e504e-kube-api-access-nk765\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.487962 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.490680 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.493553 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.504390 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk765\" (UniqueName: \"kubernetes.io/projected/91d95c97-b82e-413c-b05a-3e9cb36e504e-kube-api-access-nk765\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.517639 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:44 crc kubenswrapper[4792]: I0301 09:39:44.014876 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh"] Mar 01 09:39:44 crc kubenswrapper[4792]: I0301 09:39:44.042755 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" event={"ID":"91d95c97-b82e-413c-b05a-3e9cb36e504e","Type":"ContainerStarted","Data":"30bb0ac3c1d1f68b2a280b73be598d2147d1223e5f8cd999679a757d2ca28d20"} Mar 01 09:39:45 crc kubenswrapper[4792]: I0301 09:39:45.050759 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" event={"ID":"91d95c97-b82e-413c-b05a-3e9cb36e504e","Type":"ContainerStarted","Data":"c7d695ec4b923c131de05f4b3929cfdf1d810db81219d72b727b76b052713eaf"} Mar 01 09:39:45 crc kubenswrapper[4792]: I0301 09:39:45.065160 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" podStartSLOduration=1.672184384 podStartE2EDuration="2.065145879s" podCreationTimestamp="2026-03-01 09:39:43 +0000 UTC" firstStartedPulling="2026-03-01 09:39:44.019857487 +0000 UTC m=+1913.261736684" lastFinishedPulling="2026-03-01 09:39:44.412818982 +0000 UTC m=+1913.654698179" observedRunningTime="2026-03-01 09:39:45.061832615 +0000 UTC m=+1914.303711802" watchObservedRunningTime="2026-03-01 09:39:45.065145879 +0000 UTC m=+1914.307025076" Mar 01 09:39:54 crc kubenswrapper[4792]: I0301 09:39:54.182255 4792 generic.go:334] "Generic (PLEG): container finished" podID="91d95c97-b82e-413c-b05a-3e9cb36e504e" containerID="c7d695ec4b923c131de05f4b3929cfdf1d810db81219d72b727b76b052713eaf" exitCode=0 Mar 01 09:39:54 crc kubenswrapper[4792]: I0301 09:39:54.182325 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" event={"ID":"91d95c97-b82e-413c-b05a-3e9cb36e504e","Type":"ContainerDied","Data":"c7d695ec4b923c131de05f4b3929cfdf1d810db81219d72b727b76b052713eaf"} Mar 01 09:39:54 crc kubenswrapper[4792]: I0301 09:39:54.409176 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:39:54 crc kubenswrapper[4792]: E0301 09:39:54.409884 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.610274 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.669284 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-inventory\") pod \"91d95c97-b82e-413c-b05a-3e9cb36e504e\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.669502 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-ssh-key-openstack-edpm-ipam\") pod \"91d95c97-b82e-413c-b05a-3e9cb36e504e\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.669563 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk765\" (UniqueName: \"kubernetes.io/projected/91d95c97-b82e-413c-b05a-3e9cb36e504e-kube-api-access-nk765\") pod \"91d95c97-b82e-413c-b05a-3e9cb36e504e\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.678170 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91d95c97-b82e-413c-b05a-3e9cb36e504e-kube-api-access-nk765" (OuterVolumeSpecName: "kube-api-access-nk765") pod "91d95c97-b82e-413c-b05a-3e9cb36e504e" (UID: "91d95c97-b82e-413c-b05a-3e9cb36e504e"). InnerVolumeSpecName "kube-api-access-nk765". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.699540 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "91d95c97-b82e-413c-b05a-3e9cb36e504e" (UID: "91d95c97-b82e-413c-b05a-3e9cb36e504e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.702158 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-inventory" (OuterVolumeSpecName: "inventory") pod "91d95c97-b82e-413c-b05a-3e9cb36e504e" (UID: "91d95c97-b82e-413c-b05a-3e9cb36e504e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.771744 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.771787 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.771801 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk765\" (UniqueName: \"kubernetes.io/projected/91d95c97-b82e-413c-b05a-3e9cb36e504e-kube-api-access-nk765\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:56 crc kubenswrapper[4792]: I0301 09:39:56.201697 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" 
event={"ID":"91d95c97-b82e-413c-b05a-3e9cb36e504e","Type":"ContainerDied","Data":"30bb0ac3c1d1f68b2a280b73be598d2147d1223e5f8cd999679a757d2ca28d20"} Mar 01 09:39:56 crc kubenswrapper[4792]: I0301 09:39:56.201736 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30bb0ac3c1d1f68b2a280b73be598d2147d1223e5f8cd999679a757d2ca28d20" Mar 01 09:39:56 crc kubenswrapper[4792]: I0301 09:39:56.201758 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.137730 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539300-ws5xb"] Mar 01 09:40:00 crc kubenswrapper[4792]: E0301 09:40:00.138833 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d95c97-b82e-413c-b05a-3e9cb36e504e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.138853 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d95c97-b82e-413c-b05a-3e9cb36e504e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.139146 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d95c97-b82e-413c-b05a-3e9cb36e504e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.139808 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.143090 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.143422 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.143422 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.150135 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539300-ws5xb"] Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.271066 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hmql\" (UniqueName: \"kubernetes.io/projected/79584910-9524-4e1c-8edf-5411aa71eb0a-kube-api-access-6hmql\") pod \"auto-csr-approver-29539300-ws5xb\" (UID: \"79584910-9524-4e1c-8edf-5411aa71eb0a\") " pod="openshift-infra/auto-csr-approver-29539300-ws5xb" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.373431 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hmql\" (UniqueName: \"kubernetes.io/projected/79584910-9524-4e1c-8edf-5411aa71eb0a-kube-api-access-6hmql\") pod \"auto-csr-approver-29539300-ws5xb\" (UID: \"79584910-9524-4e1c-8edf-5411aa71eb0a\") " pod="openshift-infra/auto-csr-approver-29539300-ws5xb" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.393923 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hmql\" (UniqueName: \"kubernetes.io/projected/79584910-9524-4e1c-8edf-5411aa71eb0a-kube-api-access-6hmql\") pod \"auto-csr-approver-29539300-ws5xb\" (UID: \"79584910-9524-4e1c-8edf-5411aa71eb0a\") " 
pod="openshift-infra/auto-csr-approver-29539300-ws5xb" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.464043 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.908381 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539300-ws5xb"] Mar 01 09:40:01 crc kubenswrapper[4792]: I0301 09:40:01.272578 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" event={"ID":"79584910-9524-4e1c-8edf-5411aa71eb0a","Type":"ContainerStarted","Data":"7de1a92d2ee09022176111af02588639b0d73b7ea1861e3604c90244c0b0692c"} Mar 01 09:40:02 crc kubenswrapper[4792]: I0301 09:40:02.281343 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" event={"ID":"79584910-9524-4e1c-8edf-5411aa71eb0a","Type":"ContainerStarted","Data":"8579b82c3f2aeab429db244bb7b4d62bd406e57babe3af839bf5e91664e2433c"} Mar 01 09:40:02 crc kubenswrapper[4792]: I0301 09:40:02.296776 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" podStartSLOduration=1.372850259 podStartE2EDuration="2.29675851s" podCreationTimestamp="2026-03-01 09:40:00 +0000 UTC" firstStartedPulling="2026-03-01 09:40:00.913297914 +0000 UTC m=+1930.155177111" lastFinishedPulling="2026-03-01 09:40:01.837206165 +0000 UTC m=+1931.079085362" observedRunningTime="2026-03-01 09:40:02.293284412 +0000 UTC m=+1931.535163609" watchObservedRunningTime="2026-03-01 09:40:02.29675851 +0000 UTC m=+1931.538637707" Mar 01 09:40:03 crc kubenswrapper[4792]: I0301 09:40:03.292041 4792 generic.go:334] "Generic (PLEG): container finished" podID="79584910-9524-4e1c-8edf-5411aa71eb0a" containerID="8579b82c3f2aeab429db244bb7b4d62bd406e57babe3af839bf5e91664e2433c" exitCode=0 Mar 01 09:40:03 crc 
kubenswrapper[4792]: I0301 09:40:03.292278 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" event={"ID":"79584910-9524-4e1c-8edf-5411aa71eb0a","Type":"ContainerDied","Data":"8579b82c3f2aeab429db244bb7b4d62bd406e57babe3af839bf5e91664e2433c"} Mar 01 09:40:04 crc kubenswrapper[4792]: I0301 09:40:04.596292 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" Mar 01 09:40:04 crc kubenswrapper[4792]: I0301 09:40:04.748166 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hmql\" (UniqueName: \"kubernetes.io/projected/79584910-9524-4e1c-8edf-5411aa71eb0a-kube-api-access-6hmql\") pod \"79584910-9524-4e1c-8edf-5411aa71eb0a\" (UID: \"79584910-9524-4e1c-8edf-5411aa71eb0a\") " Mar 01 09:40:04 crc kubenswrapper[4792]: I0301 09:40:04.755239 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79584910-9524-4e1c-8edf-5411aa71eb0a-kube-api-access-6hmql" (OuterVolumeSpecName: "kube-api-access-6hmql") pod "79584910-9524-4e1c-8edf-5411aa71eb0a" (UID: "79584910-9524-4e1c-8edf-5411aa71eb0a"). InnerVolumeSpecName "kube-api-access-6hmql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:40:04 crc kubenswrapper[4792]: I0301 09:40:04.849894 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hmql\" (UniqueName: \"kubernetes.io/projected/79584910-9524-4e1c-8edf-5411aa71eb0a-kube-api-access-6hmql\") on node \"crc\" DevicePath \"\"" Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.036553 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tp5l7"] Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.043840 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tp5l7"] Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.308369 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" event={"ID":"79584910-9524-4e1c-8edf-5411aa71eb0a","Type":"ContainerDied","Data":"7de1a92d2ee09022176111af02588639b0d73b7ea1861e3604c90244c0b0692c"} Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.308408 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de1a92d2ee09022176111af02588639b0d73b7ea1861e3604c90244c0b0692c" Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.308458 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.423245 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de6ead5c-face-41ff-ab6e-aebb7ca73c1c" path="/var/lib/kubelet/pods/de6ead5c-face-41ff-ab6e-aebb7ca73c1c/volumes" Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.654854 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539294-5jtlh"] Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.663288 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539294-5jtlh"] Mar 01 09:40:07 crc kubenswrapper[4792]: I0301 09:40:07.408626 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:40:07 crc kubenswrapper[4792]: E0301 09:40:07.408923 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:40:07 crc kubenswrapper[4792]: I0301 09:40:07.419171 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6725e35-5100-4360-85ca-00aad33007d4" path="/var/lib/kubelet/pods/a6725e35-5100-4360-85ca-00aad33007d4/volumes" Mar 01 09:40:19 crc kubenswrapper[4792]: I0301 09:40:19.408881 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:40:19 crc kubenswrapper[4792]: E0301 09:40:19.410706 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:40:24 crc kubenswrapper[4792]: I0301 09:40:24.034046 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-8vfwt"] Mar 01 09:40:24 crc kubenswrapper[4792]: I0301 09:40:24.043937 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-8vfwt"] Mar 01 09:40:25 crc kubenswrapper[4792]: I0301 09:40:25.081215 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tjd85"] Mar 01 09:40:25 crc kubenswrapper[4792]: I0301 09:40:25.098914 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tjd85"] Mar 01 09:40:25 crc kubenswrapper[4792]: I0301 09:40:25.424009 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a84376-7418-49cd-9c62-fdd1af7ec31b" path="/var/lib/kubelet/pods/32a84376-7418-49cd-9c62-fdd1af7ec31b/volumes" Mar 01 09:40:25 crc kubenswrapper[4792]: I0301 09:40:25.424549 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7269b8b7-440f-4fae-b0f1-f624e9d5b29a" path="/var/lib/kubelet/pods/7269b8b7-440f-4fae-b0f1-f624e9d5b29a/volumes" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.452889 4792 scope.go:117] "RemoveContainer" containerID="05d5887b441a9b375453d0ad6f9bd8826e5d3d116043c948abf3e299df007d6e" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.473952 4792 scope.go:117] "RemoveContainer" containerID="e107bd42472fceec66462b44aaa6f7f47fb07e9eba8ac8e30bec4fee69d4eff3" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.547264 4792 scope.go:117] "RemoveContainer" containerID="7b120b9d05aec1bbdb715a0cec1430208c27b75e09d6308e245c67d773de0e22" Mar 01 09:40:29 
crc kubenswrapper[4792]: I0301 09:40:29.596384 4792 scope.go:117] "RemoveContainer" containerID="cc49a2b9acb35bd4588c6c9cb6d10085e66d67e1476ad98c515c87fcc8a40be2" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.643935 4792 scope.go:117] "RemoveContainer" containerID="d36c7b1a79f4c4f6b03c61a96da47d8288af453f4f9225ccf2c3d4099f44d0df" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.664591 4792 scope.go:117] "RemoveContainer" containerID="7a8d1321567d66f2ccb1955a5edf06d8800b1b50205fae01644b45e7fa573653" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.699085 4792 scope.go:117] "RemoveContainer" containerID="82dc2fec75535cf7d5ba98c257213dcf0b978f1770576dc67153fee7dc3473af" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.721340 4792 scope.go:117] "RemoveContainer" containerID="92439593a89d55067268c29652937630710da016f1c0b75141189d39ceeced86" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.741469 4792 scope.go:117] "RemoveContainer" containerID="dffb4e0c554e65ca661a311e46e8c09863ee0d9c8fb1783c221a0323e637a5f5" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.762437 4792 scope.go:117] "RemoveContainer" containerID="b0e1e09f850992f4661d6be2a0a76260a157f0e5c0875fd88ff0bc92644a8d13" Mar 01 09:40:34 crc kubenswrapper[4792]: I0301 09:40:34.409140 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:40:34 crc kubenswrapper[4792]: E0301 09:40:34.409717 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:40:49 crc kubenswrapper[4792]: I0301 09:40:49.408777 4792 scope.go:117] 
"RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:40:49 crc kubenswrapper[4792]: E0301 09:40:49.409601 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:41:04 crc kubenswrapper[4792]: I0301 09:41:04.408734 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:41:04 crc kubenswrapper[4792]: E0301 09:41:04.409452 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:41:08 crc kubenswrapper[4792]: I0301 09:41:08.050736 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6q8nq"] Mar 01 09:41:08 crc kubenswrapper[4792]: I0301 09:41:08.058091 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6q8nq"] Mar 01 09:41:09 crc kubenswrapper[4792]: I0301 09:41:09.419000 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8782d670-70cd-42cc-b4d7-c0c8275e457b" path="/var/lib/kubelet/pods/8782d670-70cd-42cc-b4d7-c0c8275e457b/volumes" Mar 01 09:41:19 crc kubenswrapper[4792]: I0301 09:41:19.409998 4792 scope.go:117] "RemoveContainer" 
containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:41:19 crc kubenswrapper[4792]: E0301 09:41:19.410946 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:41:29 crc kubenswrapper[4792]: I0301 09:41:29.918428 4792 scope.go:117] "RemoveContainer" containerID="c8dd8174a7f29b772205511318ce1d6a20b6f7ef7820849db345c8f1e46e0166" Mar 01 09:41:32 crc kubenswrapper[4792]: I0301 09:41:32.408859 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:41:32 crc kubenswrapper[4792]: E0301 09:41:32.409504 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:41:47 crc kubenswrapper[4792]: I0301 09:41:47.409237 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:41:47 crc kubenswrapper[4792]: E0301 09:41:47.411613 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.141087 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539302-26gpt"] Mar 01 09:42:00 crc kubenswrapper[4792]: E0301 09:42:00.143556 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79584910-9524-4e1c-8edf-5411aa71eb0a" containerName="oc" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.143647 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="79584910-9524-4e1c-8edf-5411aa71eb0a" containerName="oc" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.143952 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="79584910-9524-4e1c-8edf-5411aa71eb0a" containerName="oc" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.144590 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539302-26gpt" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.147689 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.147890 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.148441 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.153011 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539302-26gpt"] Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.189020 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkntj\" (UniqueName: 
\"kubernetes.io/projected/792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5-kube-api-access-vkntj\") pod \"auto-csr-approver-29539302-26gpt\" (UID: \"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5\") " pod="openshift-infra/auto-csr-approver-29539302-26gpt" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.291341 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkntj\" (UniqueName: \"kubernetes.io/projected/792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5-kube-api-access-vkntj\") pod \"auto-csr-approver-29539302-26gpt\" (UID: \"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5\") " pod="openshift-infra/auto-csr-approver-29539302-26gpt" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.309017 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkntj\" (UniqueName: \"kubernetes.io/projected/792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5-kube-api-access-vkntj\") pod \"auto-csr-approver-29539302-26gpt\" (UID: \"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5\") " pod="openshift-infra/auto-csr-approver-29539302-26gpt" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.408679 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:42:00 crc kubenswrapper[4792]: E0301 09:42:00.409069 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.494390 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539302-26gpt" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.714541 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539302-26gpt"] Mar 01 09:42:01 crc kubenswrapper[4792]: I0301 09:42:01.395691 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539302-26gpt" event={"ID":"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5","Type":"ContainerStarted","Data":"1cad343828afa0c050d70d2277cd2b09f0e50afc289e1270ab226b13616a6c34"} Mar 01 09:42:02 crc kubenswrapper[4792]: I0301 09:42:02.404141 4792 generic.go:334] "Generic (PLEG): container finished" podID="792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5" containerID="53fe1a8f0f86c9e965b90816a9566427d372fba1cc22db1d2bb0ca2e72f57708" exitCode=0 Mar 01 09:42:02 crc kubenswrapper[4792]: I0301 09:42:02.404178 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539302-26gpt" event={"ID":"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5","Type":"ContainerDied","Data":"53fe1a8f0f86c9e965b90816a9566427d372fba1cc22db1d2bb0ca2e72f57708"} Mar 01 09:42:03 crc kubenswrapper[4792]: I0301 09:42:03.718635 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539302-26gpt" Mar 01 09:42:03 crc kubenswrapper[4792]: I0301 09:42:03.855950 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkntj\" (UniqueName: \"kubernetes.io/projected/792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5-kube-api-access-vkntj\") pod \"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5\" (UID: \"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5\") " Mar 01 09:42:03 crc kubenswrapper[4792]: I0301 09:42:03.866877 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5-kube-api-access-vkntj" (OuterVolumeSpecName: "kube-api-access-vkntj") pod "792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5" (UID: "792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5"). InnerVolumeSpecName "kube-api-access-vkntj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:42:03 crc kubenswrapper[4792]: I0301 09:42:03.958029 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkntj\" (UniqueName: \"kubernetes.io/projected/792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5-kube-api-access-vkntj\") on node \"crc\" DevicePath \"\"" Mar 01 09:42:04 crc kubenswrapper[4792]: I0301 09:42:04.425755 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539302-26gpt" event={"ID":"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5","Type":"ContainerDied","Data":"1cad343828afa0c050d70d2277cd2b09f0e50afc289e1270ab226b13616a6c34"} Mar 01 09:42:04 crc kubenswrapper[4792]: I0301 09:42:04.425805 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cad343828afa0c050d70d2277cd2b09f0e50afc289e1270ab226b13616a6c34" Mar 01 09:42:04 crc kubenswrapper[4792]: I0301 09:42:04.425882 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539302-26gpt" Mar 01 09:42:04 crc kubenswrapper[4792]: I0301 09:42:04.788996 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539296-87tnt"] Mar 01 09:42:04 crc kubenswrapper[4792]: I0301 09:42:04.796233 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539296-87tnt"] Mar 01 09:42:05 crc kubenswrapper[4792]: I0301 09:42:05.419282 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0029741-30a3-4fc2-b71d-c77dbd652c35" path="/var/lib/kubelet/pods/f0029741-30a3-4fc2-b71d-c77dbd652c35/volumes" Mar 01 09:42:12 crc kubenswrapper[4792]: I0301 09:42:12.408946 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:42:12 crc kubenswrapper[4792]: E0301 09:42:12.409723 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:42:24 crc kubenswrapper[4792]: I0301 09:42:24.408691 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:42:24 crc kubenswrapper[4792]: E0301 09:42:24.409374 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:42:29 crc kubenswrapper[4792]: I0301 09:42:29.988145 4792 scope.go:117] "RemoveContainer" containerID="8b912a3a88a6f648dd530babff742abe47a9567e05895bab6379ed09d8bc8a56" Mar 01 09:42:37 crc kubenswrapper[4792]: I0301 09:42:37.409066 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:42:37 crc kubenswrapper[4792]: I0301 09:42:37.692877 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"80f7f8ed75933ce29e4ba1caec37158448f83396fe1f5a8bba0233aec8df1ec7"} Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.614148 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m7wjn"] Mar 01 09:43:14 crc kubenswrapper[4792]: E0301 09:43:14.615143 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5" containerName="oc" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.615158 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5" containerName="oc" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.615364 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5" containerName="oc" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.616829 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.622685 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m7wjn"] Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.780812 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b6b7\" (UniqueName: \"kubernetes.io/projected/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-kube-api-access-4b6b7\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.781218 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-catalog-content\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.781267 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-utilities\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.883724 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-catalog-content\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.883794 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-utilities\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.883944 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b6b7\" (UniqueName: \"kubernetes.io/projected/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-kube-api-access-4b6b7\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.884288 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-catalog-content\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.884362 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-utilities\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.908745 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b6b7\" (UniqueName: \"kubernetes.io/projected/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-kube-api-access-4b6b7\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.993538 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:15 crc kubenswrapper[4792]: I0301 09:43:15.504433 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m7wjn"] Mar 01 09:43:15 crc kubenswrapper[4792]: I0301 09:43:15.991154 4792 generic.go:334] "Generic (PLEG): container finished" podID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerID="085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925" exitCode=0 Mar 01 09:43:15 crc kubenswrapper[4792]: I0301 09:43:15.991244 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7wjn" event={"ID":"0a8e78a0-a8cd-450d-ad43-bb8060b2111c","Type":"ContainerDied","Data":"085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925"} Mar 01 09:43:15 crc kubenswrapper[4792]: I0301 09:43:15.991527 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7wjn" event={"ID":"0a8e78a0-a8cd-450d-ad43-bb8060b2111c","Type":"ContainerStarted","Data":"10a006007bcbaefd78d690abc403d6226b7d5583a504c774c24e533714ae4bd4"} Mar 01 09:43:17 crc kubenswrapper[4792]: I0301 09:43:17.002251 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7wjn" event={"ID":"0a8e78a0-a8cd-450d-ad43-bb8060b2111c","Type":"ContainerStarted","Data":"dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86"} Mar 01 09:43:22 crc kubenswrapper[4792]: I0301 09:43:22.038245 4792 generic.go:334] "Generic (PLEG): container finished" podID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerID="dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86" exitCode=0 Mar 01 09:43:22 crc kubenswrapper[4792]: I0301 09:43:22.038465 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7wjn" 
event={"ID":"0a8e78a0-a8cd-450d-ad43-bb8060b2111c","Type":"ContainerDied","Data":"dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86"} Mar 01 09:43:23 crc kubenswrapper[4792]: I0301 09:43:23.047093 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7wjn" event={"ID":"0a8e78a0-a8cd-450d-ad43-bb8060b2111c","Type":"ContainerStarted","Data":"b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13"} Mar 01 09:43:23 crc kubenswrapper[4792]: I0301 09:43:23.065184 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m7wjn" podStartSLOduration=2.661028159 podStartE2EDuration="9.065166796s" podCreationTimestamp="2026-03-01 09:43:14 +0000 UTC" firstStartedPulling="2026-03-01 09:43:15.992922702 +0000 UTC m=+2125.234801899" lastFinishedPulling="2026-03-01 09:43:22.397061339 +0000 UTC m=+2131.638940536" observedRunningTime="2026-03-01 09:43:23.062354285 +0000 UTC m=+2132.304233482" watchObservedRunningTime="2026-03-01 09:43:23.065166796 +0000 UTC m=+2132.307045993" Mar 01 09:43:24 crc kubenswrapper[4792]: I0301 09:43:24.993645 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:24 crc kubenswrapper[4792]: I0301 09:43:24.993995 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:26 crc kubenswrapper[4792]: I0301 09:43:26.044587 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m7wjn" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="registry-server" probeResult="failure" output=< Mar 01 09:43:26 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:43:26 crc kubenswrapper[4792]: > Mar 01 09:43:35 crc kubenswrapper[4792]: I0301 09:43:35.037242 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:35 crc kubenswrapper[4792]: I0301 09:43:35.081844 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:35 crc kubenswrapper[4792]: I0301 09:43:35.273521 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m7wjn"] Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.153282 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m7wjn" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="registry-server" containerID="cri-o://b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13" gracePeriod=2 Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.532492 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.726222 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b6b7\" (UniqueName: \"kubernetes.io/projected/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-kube-api-access-4b6b7\") pod \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.726722 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-catalog-content\") pod \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.726775 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-utilities\") pod 
\"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.727326 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-utilities" (OuterVolumeSpecName: "utilities") pod "0a8e78a0-a8cd-450d-ad43-bb8060b2111c" (UID: "0a8e78a0-a8cd-450d-ad43-bb8060b2111c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.734659 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-kube-api-access-4b6b7" (OuterVolumeSpecName: "kube-api-access-4b6b7") pod "0a8e78a0-a8cd-450d-ad43-bb8060b2111c" (UID: "0a8e78a0-a8cd-450d-ad43-bb8060b2111c"). InnerVolumeSpecName "kube-api-access-4b6b7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.829266 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.829756 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b6b7\" (UniqueName: \"kubernetes.io/projected/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-kube-api-access-4b6b7\") on node \"crc\" DevicePath \"\"" Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.845272 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a8e78a0-a8cd-450d-ad43-bb8060b2111c" (UID: "0a8e78a0-a8cd-450d-ad43-bb8060b2111c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.932675 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.173274 4792 generic.go:334] "Generic (PLEG): container finished" podID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerID="b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13" exitCode=0 Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.173314 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7wjn" event={"ID":"0a8e78a0-a8cd-450d-ad43-bb8060b2111c","Type":"ContainerDied","Data":"b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13"} Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.173339 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7wjn" event={"ID":"0a8e78a0-a8cd-450d-ad43-bb8060b2111c","Type":"ContainerDied","Data":"10a006007bcbaefd78d690abc403d6226b7d5583a504c774c24e533714ae4bd4"} Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.173348 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.173363 4792 scope.go:117] "RemoveContainer" containerID="b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.189787 4792 scope.go:117] "RemoveContainer" containerID="dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.218230 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m7wjn"] Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.224237 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m7wjn"] Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.226110 4792 scope.go:117] "RemoveContainer" containerID="085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.263367 4792 scope.go:117] "RemoveContainer" containerID="b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13" Mar 01 09:43:37 crc kubenswrapper[4792]: E0301 09:43:37.265314 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13\": container with ID starting with b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13 not found: ID does not exist" containerID="b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.265361 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13"} err="failed to get container status \"b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13\": rpc error: code = NotFound desc = could not find container 
\"b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13\": container with ID starting with b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13 not found: ID does not exist" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.265386 4792 scope.go:117] "RemoveContainer" containerID="dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86" Mar 01 09:43:37 crc kubenswrapper[4792]: E0301 09:43:37.266621 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86\": container with ID starting with dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86 not found: ID does not exist" containerID="dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.266653 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86"} err="failed to get container status \"dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86\": rpc error: code = NotFound desc = could not find container \"dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86\": container with ID starting with dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86 not found: ID does not exist" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.266694 4792 scope.go:117] "RemoveContainer" containerID="085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925" Mar 01 09:43:37 crc kubenswrapper[4792]: E0301 09:43:37.266989 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925\": container with ID starting with 085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925 not found: ID does not exist" 
containerID="085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.267089 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925"} err="failed to get container status \"085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925\": rpc error: code = NotFound desc = could not find container \"085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925\": container with ID starting with 085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925 not found: ID does not exist" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.418161 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" path="/var/lib/kubelet/pods/0a8e78a0-a8cd-450d-ad43-bb8060b2111c/volumes" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.144765 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539304-c2vn2"] Mar 01 09:44:00 crc kubenswrapper[4792]: E0301 09:44:00.145674 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="extract-utilities" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.145689 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="extract-utilities" Mar 01 09:44:00 crc kubenswrapper[4792]: E0301 09:44:00.145707 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="extract-content" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.145714 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="extract-content" Mar 01 09:44:00 crc kubenswrapper[4792]: E0301 09:44:00.145748 4792 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="registry-server" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.145757 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="registry-server" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.146045 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="registry-server" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.146746 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539304-c2vn2" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.151419 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.151497 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.151590 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.158473 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539304-c2vn2"] Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.253648 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6szt8\" (UniqueName: \"kubernetes.io/projected/97e68f99-8c1f-4046-bb89-66516bff6370-kube-api-access-6szt8\") pod \"auto-csr-approver-29539304-c2vn2\" (UID: \"97e68f99-8c1f-4046-bb89-66516bff6370\") " pod="openshift-infra/auto-csr-approver-29539304-c2vn2" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.355822 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6szt8\" 
(UniqueName: \"kubernetes.io/projected/97e68f99-8c1f-4046-bb89-66516bff6370-kube-api-access-6szt8\") pod \"auto-csr-approver-29539304-c2vn2\" (UID: \"97e68f99-8c1f-4046-bb89-66516bff6370\") " pod="openshift-infra/auto-csr-approver-29539304-c2vn2" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.380431 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6szt8\" (UniqueName: \"kubernetes.io/projected/97e68f99-8c1f-4046-bb89-66516bff6370-kube-api-access-6szt8\") pod \"auto-csr-approver-29539304-c2vn2\" (UID: \"97e68f99-8c1f-4046-bb89-66516bff6370\") " pod="openshift-infra/auto-csr-approver-29539304-c2vn2" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.478732 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539304-c2vn2" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.903854 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539304-c2vn2"] Mar 01 09:44:01 crc kubenswrapper[4792]: I0301 09:44:01.361511 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539304-c2vn2" event={"ID":"97e68f99-8c1f-4046-bb89-66516bff6370","Type":"ContainerStarted","Data":"4e11349cda068ff64a159b98fad5cbc797c08e7d4f6ef9edc61ac56dff713ded"} Mar 01 09:44:02 crc kubenswrapper[4792]: I0301 09:44:02.371102 4792 generic.go:334] "Generic (PLEG): container finished" podID="97e68f99-8c1f-4046-bb89-66516bff6370" containerID="3cbaa243041e250919798684d495339949ab384e80a45c460f4d0e0c2cfab407" exitCode=0 Mar 01 09:44:02 crc kubenswrapper[4792]: I0301 09:44:02.371207 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539304-c2vn2" event={"ID":"97e68f99-8c1f-4046-bb89-66516bff6370","Type":"ContainerDied","Data":"3cbaa243041e250919798684d495339949ab384e80a45c460f4d0e0c2cfab407"} Mar 01 09:44:03 crc kubenswrapper[4792]: I0301 09:44:03.716925 4792 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539304-c2vn2" Mar 01 09:44:03 crc kubenswrapper[4792]: I0301 09:44:03.818847 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6szt8\" (UniqueName: \"kubernetes.io/projected/97e68f99-8c1f-4046-bb89-66516bff6370-kube-api-access-6szt8\") pod \"97e68f99-8c1f-4046-bb89-66516bff6370\" (UID: \"97e68f99-8c1f-4046-bb89-66516bff6370\") " Mar 01 09:44:03 crc kubenswrapper[4792]: I0301 09:44:03.825036 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e68f99-8c1f-4046-bb89-66516bff6370-kube-api-access-6szt8" (OuterVolumeSpecName: "kube-api-access-6szt8") pod "97e68f99-8c1f-4046-bb89-66516bff6370" (UID: "97e68f99-8c1f-4046-bb89-66516bff6370"). InnerVolumeSpecName "kube-api-access-6szt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:44:03 crc kubenswrapper[4792]: I0301 09:44:03.920758 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6szt8\" (UniqueName: \"kubernetes.io/projected/97e68f99-8c1f-4046-bb89-66516bff6370-kube-api-access-6szt8\") on node \"crc\" DevicePath \"\"" Mar 01 09:44:04 crc kubenswrapper[4792]: I0301 09:44:04.390079 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539304-c2vn2" event={"ID":"97e68f99-8c1f-4046-bb89-66516bff6370","Type":"ContainerDied","Data":"4e11349cda068ff64a159b98fad5cbc797c08e7d4f6ef9edc61ac56dff713ded"} Mar 01 09:44:04 crc kubenswrapper[4792]: I0301 09:44:04.390376 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e11349cda068ff64a159b98fad5cbc797c08e7d4f6ef9edc61ac56dff713ded" Mar 01 09:44:04 crc kubenswrapper[4792]: I0301 09:44:04.390447 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539304-c2vn2" Mar 01 09:44:04 crc kubenswrapper[4792]: I0301 09:44:04.785392 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539298-ckkqh"] Mar 01 09:44:04 crc kubenswrapper[4792]: I0301 09:44:04.792244 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539298-ckkqh"] Mar 01 09:44:05 crc kubenswrapper[4792]: I0301 09:44:05.417323 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e9f36fa-467b-4b49-9d69-b465a22837e5" path="/var/lib/kubelet/pods/7e9f36fa-467b-4b49-9d69-b465a22837e5/volumes" Mar 01 09:44:30 crc kubenswrapper[4792]: I0301 09:44:30.085704 4792 scope.go:117] "RemoveContainer" containerID="fe8d25b14be5e63dea82359594e611616a93ae643f10a8ff38209498ecbc612f" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.144482 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz"] Mar 01 09:45:00 crc kubenswrapper[4792]: E0301 09:45:00.145475 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e68f99-8c1f-4046-bb89-66516bff6370" containerName="oc" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.145492 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e68f99-8c1f-4046-bb89-66516bff6370" containerName="oc" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.145720 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e68f99-8c1f-4046-bb89-66516bff6370" containerName="oc" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.146486 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.149355 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.149854 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.156426 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz"] Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.242382 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-secret-volume\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.242473 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-config-volume\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.242494 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4xqp\" (UniqueName: \"kubernetes.io/projected/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-kube-api-access-k4xqp\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.343895 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-secret-volume\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.343992 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-config-volume\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.344018 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4xqp\" (UniqueName: \"kubernetes.io/projected/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-kube-api-access-k4xqp\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.344959 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-config-volume\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.350201 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-secret-volume\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.365467 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4xqp\" (UniqueName: \"kubernetes.io/projected/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-kube-api-access-k4xqp\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.466316 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.893505 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz"] Mar 01 09:45:01 crc kubenswrapper[4792]: I0301 09:45:01.862372 4792 generic.go:334] "Generic (PLEG): container finished" podID="e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" containerID="5aa67c39154d74ad3f75b4616f3e6439b947759682dc6085d9ce37f8cd99894c" exitCode=0 Mar 01 09:45:01 crc kubenswrapper[4792]: I0301 09:45:01.862418 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" event={"ID":"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c","Type":"ContainerDied","Data":"5aa67c39154d74ad3f75b4616f3e6439b947759682dc6085d9ce37f8cd99894c"} Mar 01 09:45:01 crc kubenswrapper[4792]: I0301 09:45:01.862680 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" 
event={"ID":"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c","Type":"ContainerStarted","Data":"0df8929d04bb8c6b5b519f1a9969780072c9fcb0dfa7729cd770332cd5028e91"}
Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.222148    4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz"
Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.294356    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-secret-volume\") pod \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") "
Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.294462    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-config-volume\") pod \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") "
Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.294523    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4xqp\" (UniqueName: \"kubernetes.io/projected/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-kube-api-access-k4xqp\") pod \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") "
Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.295468    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-config-volume" (OuterVolumeSpecName: "config-volume") pod "e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" (UID: "e7a9ad8e-1c99-4a79-87eb-912aab1dc48c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.305180    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" (UID: "e7a9ad8e-1c99-4a79-87eb-912aab1dc48c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.305250    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-kube-api-access-k4xqp" (OuterVolumeSpecName: "kube-api-access-k4xqp") pod "e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" (UID: "e7a9ad8e-1c99-4a79-87eb-912aab1dc48c"). InnerVolumeSpecName "kube-api-access-k4xqp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.396863    4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.396899    4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-config-volume\") on node \"crc\" DevicePath \"\""
Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.396932    4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4xqp\" (UniqueName: \"kubernetes.io/projected/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-kube-api-access-k4xqp\") on node \"crc\" DevicePath \"\""
Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.877418    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" event={"ID":"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c","Type":"ContainerDied","Data":"0df8929d04bb8c6b5b519f1a9969780072c9fcb0dfa7729cd770332cd5028e91"}
Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.877460    4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0df8929d04bb8c6b5b519f1a9969780072c9fcb0dfa7729cd770332cd5028e91"
Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.877580    4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz"
Mar 01 09:45:04 crc kubenswrapper[4792]: I0301 09:45:04.296810    4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn"]
Mar 01 09:45:04 crc kubenswrapper[4792]: I0301 09:45:04.304131    4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn"]
Mar 01 09:45:04 crc kubenswrapper[4792]: I0301 09:45:04.943249    4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:45:04 crc kubenswrapper[4792]: I0301 09:45:04.943298    4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:45:05 crc kubenswrapper[4792]: I0301 09:45:05.422152    4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" path="/var/lib/kubelet/pods/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3/volumes"
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.089830    4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.095504    4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.104014    4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.111439    4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.117845    4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.124103    4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.129867    4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.135536    4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.141779    4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.147433    4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-27xxx"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.152797    4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.158146    4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.163437    4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.170567    4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.177203    4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.185000    4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-27xxx"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.191178    4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.197095    4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.202675    4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"]
Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.208038    4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"]
Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.419667    4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f054d9d-4fbb-4909-826c-e6037c4716bd" path="/var/lib/kubelet/pods/1f054d9d-4fbb-4909-826c-e6037c4716bd/volumes"
Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.421560    4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" path="/var/lib/kubelet/pods/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8/volumes"
Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.422738    4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a742181-aebe-42f8-a83e-fee7b480366b" path="/var/lib/kubelet/pods/4a742181-aebe-42f8-a83e-fee7b480366b/volumes"
Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.423416    4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54e68c85-54c7-4855-b4a0-a85d2014c7b7" path="/var/lib/kubelet/pods/54e68c85-54c7-4855-b4a0-a85d2014c7b7/volumes"
Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.424560    4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c6429ad-21d8-4f58-900b-e5f6fe4d603d" path="/var/lib/kubelet/pods/5c6429ad-21d8-4f58-900b-e5f6fe4d603d/volumes"
Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.425482    4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8787b5ba-7462-4594-a11d-2d0afbfe3c1c" path="/var/lib/kubelet/pods/8787b5ba-7462-4594-a11d-2d0afbfe3c1c/volumes"
Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.426055    4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91d95c97-b82e-413c-b05a-3e9cb36e504e" path="/var/lib/kubelet/pods/91d95c97-b82e-413c-b05a-3e9cb36e504e/volumes"
Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.427022    4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af8a1fb-52d8-4b08-be39-ad106833ba1c" path="/var/lib/kubelet/pods/9af8a1fb-52d8-4b08-be39-ad106833ba1c/volumes"
Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.427570    4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6a7e948-b141-4fb0-b717-3d02a9014dd4" path="/var/lib/kubelet/pods/a6a7e948-b141-4fb0-b717-3d02a9014dd4/volumes"
Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.428129    4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5ccb279-c8b2-4288-9072-1175061be204" path="/var/lib/kubelet/pods/b5ccb279-c8b2-4288-9072-1175061be204/volumes"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.101350    4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"]
Mar 01 09:45:21 crc kubenswrapper[4792]: E0301 09:45:21.102331    4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" containerName="collect-profiles"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.102347    4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" containerName="collect-profiles"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.102794    4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" containerName="collect-profiles"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.103430    4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.106888    4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.107193    4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.107417    4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.107707    4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.107950    4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.120489    4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"]
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.205131    4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.205259    4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqf5q\" (UniqueName: \"kubernetes.io/projected/6c517000-6918-4f58-871b-7c4d26197ccf-kube-api-access-rqf5q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.205289    4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.205318    4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.205356    4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.307256    4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqf5q\" (UniqueName: \"kubernetes.io/projected/6c517000-6918-4f58-871b-7c4d26197ccf-kube-api-access-rqf5q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.307313    4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.307360    4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.307413    4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.307488    4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.313179    4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.313987    4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.319240    4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.320659    4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.325000    4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqf5q\" (UniqueName: \"kubernetes.io/projected/6c517000-6918-4f58-871b-7c4d26197ccf-kube-api-access-rqf5q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.480710    4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:22 crc kubenswrapper[4792]: I0301 09:45:22.035584    4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"]
Mar 01 09:45:22 crc kubenswrapper[4792]: I0301 09:45:22.041478    4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 01 09:45:22 crc kubenswrapper[4792]: I0301 09:45:22.074397    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" event={"ID":"6c517000-6918-4f58-871b-7c4d26197ccf","Type":"ContainerStarted","Data":"9c1974b1db8a8438cbc5a90dfa75c48b2a61a08d08a187b7c66e02d0f00f2799"}
Mar 01 09:45:23 crc kubenswrapper[4792]: I0301 09:45:23.084111    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" event={"ID":"6c517000-6918-4f58-871b-7c4d26197ccf","Type":"ContainerStarted","Data":"9485400e953b4441f321cb6741c7780ced074d1502b17d2cd0f2132161817682"}
Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.214291    4792 scope.go:117] "RemoveContainer" containerID="5773fb565f7454c83bd5a97647f2258db8af4721a7199d6a6eb816399f3a0abe"
Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.246219    4792 scope.go:117] "RemoveContainer" containerID="6ad37eb5a9ce8285310a5d61f804630e8b0a3954519985d12a0d71d52e93d217"
Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.296714    4792 scope.go:117] "RemoveContainer" containerID="40d6bf3bff7640be17f842b1e208ca6cbd13ff723d54eb172fb51e9f37d11d71"
Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.355168    4792 scope.go:117] "RemoveContainer" containerID="2b11700043caea12ce3ec9b1a685865b6228db2c927ef984914abec9ff9701b8"
Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.420478    4792 scope.go:117] "RemoveContainer" containerID="a0ae22fc112a93c604faf629d5ca18987a7aa343d92829f2e65e4501f2454496"
Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.456655    4792 scope.go:117] "RemoveContainer" containerID="1fcee8427ea6340db8e69cb0e43a52de1fe2f18dc84e960d49fc0b0918052c29"
Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.573121    4792 scope.go:117] "RemoveContainer" containerID="291e30822651ff807afe1c8290d577ee02386c82661acb3758a8b13541958167"
Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.623140    4792 scope.go:117] "RemoveContainer" containerID="263ab29e6e451a06b962c7883f7e5448fbe4595217de2d6fcca680e67789bfae"
Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.652081    4792 scope.go:117] "RemoveContainer" containerID="db93ce9b530ac529df661a0d9b5aa2418392875dd62f9aecf46dc33ff7dfd43c"
Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.293931    4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" podStartSLOduration=9.860264431000001 podStartE2EDuration="10.293892714s" podCreationTimestamp="2026-03-01 09:45:21 +0000 UTC" firstStartedPulling="2026-03-01 09:45:22.041134685 +0000 UTC m=+2251.283013882" lastFinishedPulling="2026-03-01 09:45:22.474762968 +0000 UTC m=+2251.716642165" observedRunningTime="2026-03-01 09:45:23.104201431 +0000 UTC m=+2252.346080638" watchObservedRunningTime="2026-03-01 09:45:31.293892714 +0000 UTC m=+2260.535771911"
Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.296779    4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6s29r"]
Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.298657    4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6s29r"
Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.322034    4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6s29r"]
Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.384763    4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-utilities\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r"
Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.384862    4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqj95\" (UniqueName: \"kubernetes.io/projected/fab28167-0dde-44ec-a712-e11f418fd4e7-kube-api-access-gqj95\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r"
Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.384924    4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-catalog-content\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r"
Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.486700    4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqj95\" (UniqueName: \"kubernetes.io/projected/fab28167-0dde-44ec-a712-e11f418fd4e7-kube-api-access-gqj95\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r"
Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.486800    4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-catalog-content\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r"
Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.486940    4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-utilities\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r"
Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.487259    4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-catalog-content\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r"
Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.488105    4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-utilities\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r"
Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.522732    4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqj95\" (UniqueName: \"kubernetes.io/projected/fab28167-0dde-44ec-a712-e11f418fd4e7-kube-api-access-gqj95\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r"
Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.613656    4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6s29r"
Mar 01 09:45:32 crc kubenswrapper[4792]: I0301 09:45:32.202420    4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6s29r"]
Mar 01 09:45:33 crc kubenswrapper[4792]: I0301 09:45:33.164144    4792 generic.go:334] "Generic (PLEG): container finished" podID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerID="0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9" exitCode=0
Mar 01 09:45:33 crc kubenswrapper[4792]: I0301 09:45:33.164257    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s29r" event={"ID":"fab28167-0dde-44ec-a712-e11f418fd4e7","Type":"ContainerDied","Data":"0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9"}
Mar 01 09:45:33 crc kubenswrapper[4792]: I0301 09:45:33.164461    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s29r" event={"ID":"fab28167-0dde-44ec-a712-e11f418fd4e7","Type":"ContainerStarted","Data":"650137c017e7367d56531d13c48673396dc39fd07a4b1f503918cddfea0fc738"}
Mar 01 09:45:34 crc kubenswrapper[4792]: I0301 09:45:34.174032    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s29r" event={"ID":"fab28167-0dde-44ec-a712-e11f418fd4e7","Type":"ContainerStarted","Data":"9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906"}
Mar 01 09:45:34 crc kubenswrapper[4792]: I0301 09:45:34.943601    4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:45:34 crc kubenswrapper[4792]: I0301 09:45:34.943666    4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:45:35 crc kubenswrapper[4792]: I0301 09:45:35.182171    4792 generic.go:334] "Generic (PLEG): container finished" podID="6c517000-6918-4f58-871b-7c4d26197ccf" containerID="9485400e953b4441f321cb6741c7780ced074d1502b17d2cd0f2132161817682" exitCode=0
Mar 01 09:45:35 crc kubenswrapper[4792]: I0301 09:45:35.183312    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" event={"ID":"6c517000-6918-4f58-871b-7c4d26197ccf","Type":"ContainerDied","Data":"9485400e953b4441f321cb6741c7780ced074d1502b17d2cd0f2132161817682"}
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.193554    4792 generic.go:334] "Generic (PLEG): container finished" podID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerID="9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906" exitCode=0
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.193657    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s29r" event={"ID":"fab28167-0dde-44ec-a712-e11f418fd4e7","Type":"ContainerDied","Data":"9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906"}
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.626950    4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.690120    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-repo-setup-combined-ca-bundle\") pod \"6c517000-6918-4f58-871b-7c4d26197ccf\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") "
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.690167    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ssh-key-openstack-edpm-ipam\") pod \"6c517000-6918-4f58-871b-7c4d26197ccf\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") "
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.690203    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-inventory\") pod \"6c517000-6918-4f58-871b-7c4d26197ccf\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") "
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.690278    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqf5q\" (UniqueName: \"kubernetes.io/projected/6c517000-6918-4f58-871b-7c4d26197ccf-kube-api-access-rqf5q\") pod \"6c517000-6918-4f58-871b-7c4d26197ccf\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") "
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.690631    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ceph\") pod \"6c517000-6918-4f58-871b-7c4d26197ccf\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") "
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.697576    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ceph" (OuterVolumeSpecName: "ceph") pod "6c517000-6918-4f58-871b-7c4d26197ccf" (UID: "6c517000-6918-4f58-871b-7c4d26197ccf"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.697883    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6c517000-6918-4f58-871b-7c4d26197ccf" (UID: "6c517000-6918-4f58-871b-7c4d26197ccf"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.698110    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c517000-6918-4f58-871b-7c4d26197ccf-kube-api-access-rqf5q" (OuterVolumeSpecName: "kube-api-access-rqf5q") pod "6c517000-6918-4f58-871b-7c4d26197ccf" (UID: "6c517000-6918-4f58-871b-7c4d26197ccf"). InnerVolumeSpecName "kube-api-access-rqf5q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.717272    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6c517000-6918-4f58-871b-7c4d26197ccf" (UID: "6c517000-6918-4f58-871b-7c4d26197ccf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.722607    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-inventory" (OuterVolumeSpecName: "inventory") pod "6c517000-6918-4f58-871b-7c4d26197ccf" (UID: "6c517000-6918-4f58-871b-7c4d26197ccf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.792618    4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqf5q\" (UniqueName: \"kubernetes.io/projected/6c517000-6918-4f58-871b-7c4d26197ccf-kube-api-access-rqf5q\") on node \"crc\" DevicePath \"\""
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.792888    4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ceph\") on node \"crc\" DevicePath \"\""
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.792899    4792 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.792921    4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.792930    4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-inventory\") on node \"crc\" DevicePath \"\""
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.243699    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s29r" event={"ID":"fab28167-0dde-44ec-a712-e11f418fd4e7","Type":"ContainerStarted","Data":"e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0"}
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.246993    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" event={"ID":"6c517000-6918-4f58-871b-7c4d26197ccf","Type":"ContainerDied","Data":"9c1974b1db8a8438cbc5a90dfa75c48b2a61a08d08a187b7c66e02d0f00f2799"}
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.247035    4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c1974b1db8a8438cbc5a90dfa75c48b2a61a08d08a187b7c66e02d0f00f2799"
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.247088    4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.277919    4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6s29r" podStartSLOduration=2.844325643 podStartE2EDuration="6.277890814s" podCreationTimestamp="2026-03-01 09:45:31 +0000 UTC" firstStartedPulling="2026-03-01 09:45:33.165812012 +0000 UTC m=+2262.407691229" lastFinishedPulling="2026-03-01 09:45:36.599377193 +0000 UTC m=+2265.841256400" observedRunningTime="2026-03-01 09:45:37.264167411 +0000 UTC m=+2266.506046608" watchObservedRunningTime="2026-03-01 09:45:37.277890814 +0000 UTC m=+2266.519770011"
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.325050    4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb"]
Mar 01 09:45:37 crc kubenswrapper[4792]: E0301 09:45:37.325386    4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c517000-6918-4f58-871b-7c4d26197ccf" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.325404    4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c517000-6918-4f58-871b-7c4d26197ccf" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.325629    4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c517000-6918-4f58-871b-7c4d26197ccf" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.326259    4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb"
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.328876    4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.329439    4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5"
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.329695    4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.329973    4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.330200    4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.342412    4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb"]
Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.409934    4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx4hm\" (UniqueName:
\"kubernetes.io/projected/1201ca91-41eb-45d0-991d-71883b4014ae-kube-api-access-vx4hm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.410253 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.410388 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.410589 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.410739 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: 
\"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.512462 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.512790 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.513340 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.513745 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.513976 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-vx4hm\" (UniqueName: \"kubernetes.io/projected/1201ca91-41eb-45d0-991d-71883b4014ae-kube-api-access-vx4hm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.517424 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.517658 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.517871 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.518107 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.534480 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx4hm\" (UniqueName: \"kubernetes.io/projected/1201ca91-41eb-45d0-991d-71883b4014ae-kube-api-access-vx4hm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.654839 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:38 crc kubenswrapper[4792]: I0301 09:45:38.172697 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb"] Mar 01 09:45:38 crc kubenswrapper[4792]: W0301 09:45:38.179335 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1201ca91_41eb_45d0_991d_71883b4014ae.slice/crio-d3a561fc80333203759a7141294eee19a809aea7b68c78b4e20c79a31fc48cc8 WatchSource:0}: Error finding container d3a561fc80333203759a7141294eee19a809aea7b68c78b4e20c79a31fc48cc8: Status 404 returned error can't find the container with id d3a561fc80333203759a7141294eee19a809aea7b68c78b4e20c79a31fc48cc8 Mar 01 09:45:38 crc kubenswrapper[4792]: I0301 09:45:38.255875 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" event={"ID":"1201ca91-41eb-45d0-991d-71883b4014ae","Type":"ContainerStarted","Data":"d3a561fc80333203759a7141294eee19a809aea7b68c78b4e20c79a31fc48cc8"} Mar 01 09:45:39 crc kubenswrapper[4792]: I0301 09:45:39.268109 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" 
event={"ID":"1201ca91-41eb-45d0-991d-71883b4014ae","Type":"ContainerStarted","Data":"f9bde77e46db7a7a43711f203d62d437c0037f5981ab90809f1911e681e22c3b"} Mar 01 09:45:39 crc kubenswrapper[4792]: I0301 09:45:39.302856 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" podStartSLOduration=1.8496646289999998 podStartE2EDuration="2.302827851s" podCreationTimestamp="2026-03-01 09:45:37 +0000 UTC" firstStartedPulling="2026-03-01 09:45:38.181811166 +0000 UTC m=+2267.423690363" lastFinishedPulling="2026-03-01 09:45:38.634974388 +0000 UTC m=+2267.876853585" observedRunningTime="2026-03-01 09:45:39.287543809 +0000 UTC m=+2268.529423036" watchObservedRunningTime="2026-03-01 09:45:39.302827851 +0000 UTC m=+2268.544707058" Mar 01 09:45:41 crc kubenswrapper[4792]: I0301 09:45:41.614659 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:41 crc kubenswrapper[4792]: I0301 09:45:41.614986 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:41 crc kubenswrapper[4792]: I0301 09:45:41.657569 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:42 crc kubenswrapper[4792]: I0301 09:45:42.336226 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:42 crc kubenswrapper[4792]: I0301 09:45:42.378386 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6s29r"] Mar 01 09:45:44 crc kubenswrapper[4792]: I0301 09:45:44.306744 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6s29r" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" 
containerName="registry-server" containerID="cri-o://e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0" gracePeriod=2 Mar 01 09:45:44 crc kubenswrapper[4792]: I0301 09:45:44.834671 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:44 crc kubenswrapper[4792]: I0301 09:45:44.968863 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqj95\" (UniqueName: \"kubernetes.io/projected/fab28167-0dde-44ec-a712-e11f418fd4e7-kube-api-access-gqj95\") pod \"fab28167-0dde-44ec-a712-e11f418fd4e7\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " Mar 01 09:45:44 crc kubenswrapper[4792]: I0301 09:45:44.969205 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-utilities\") pod \"fab28167-0dde-44ec-a712-e11f418fd4e7\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " Mar 01 09:45:44 crc kubenswrapper[4792]: I0301 09:45:44.969239 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-catalog-content\") pod \"fab28167-0dde-44ec-a712-e11f418fd4e7\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " Mar 01 09:45:44 crc kubenswrapper[4792]: I0301 09:45:44.969967 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-utilities" (OuterVolumeSpecName: "utilities") pod "fab28167-0dde-44ec-a712-e11f418fd4e7" (UID: "fab28167-0dde-44ec-a712-e11f418fd4e7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:45:44 crc kubenswrapper[4792]: I0301 09:45:44.976603 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fab28167-0dde-44ec-a712-e11f418fd4e7-kube-api-access-gqj95" (OuterVolumeSpecName: "kube-api-access-gqj95") pod "fab28167-0dde-44ec-a712-e11f418fd4e7" (UID: "fab28167-0dde-44ec-a712-e11f418fd4e7"). InnerVolumeSpecName "kube-api-access-gqj95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.071191 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.071228 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqj95\" (UniqueName: \"kubernetes.io/projected/fab28167-0dde-44ec-a712-e11f418fd4e7-kube-api-access-gqj95\") on node \"crc\" DevicePath \"\"" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.321376 4792 generic.go:334] "Generic (PLEG): container finished" podID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerID="e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0" exitCode=0 Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.321432 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s29r" event={"ID":"fab28167-0dde-44ec-a712-e11f418fd4e7","Type":"ContainerDied","Data":"e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0"} Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.321478 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s29r" event={"ID":"fab28167-0dde-44ec-a712-e11f418fd4e7","Type":"ContainerDied","Data":"650137c017e7367d56531d13c48673396dc39fd07a4b1f503918cddfea0fc738"} Mar 01 09:45:45 crc kubenswrapper[4792]: 
I0301 09:45:45.321480 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.321494 4792 scope.go:117] "RemoveContainer" containerID="e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.346002 4792 scope.go:117] "RemoveContainer" containerID="9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.369125 4792 scope.go:117] "RemoveContainer" containerID="0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.415698 4792 scope.go:117] "RemoveContainer" containerID="e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0" Mar 01 09:45:45 crc kubenswrapper[4792]: E0301 09:45:45.416360 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0\": container with ID starting with e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0 not found: ID does not exist" containerID="e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.416397 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0"} err="failed to get container status \"e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0\": rpc error: code = NotFound desc = could not find container \"e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0\": container with ID starting with e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0 not found: ID does not exist" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.416450 4792 
scope.go:117] "RemoveContainer" containerID="9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906" Mar 01 09:45:45 crc kubenswrapper[4792]: E0301 09:45:45.417159 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906\": container with ID starting with 9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906 not found: ID does not exist" containerID="9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.417220 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906"} err="failed to get container status \"9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906\": rpc error: code = NotFound desc = could not find container \"9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906\": container with ID starting with 9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906 not found: ID does not exist" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.417254 4792 scope.go:117] "RemoveContainer" containerID="0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9" Mar 01 09:45:45 crc kubenswrapper[4792]: E0301 09:45:45.417610 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9\": container with ID starting with 0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9 not found: ID does not exist" containerID="0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.417640 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9"} err="failed to get container status \"0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9\": rpc error: code = NotFound desc = could not find container \"0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9\": container with ID starting with 0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9 not found: ID does not exist" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.489414 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fab28167-0dde-44ec-a712-e11f418fd4e7" (UID: "fab28167-0dde-44ec-a712-e11f418fd4e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.580307 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.661746 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6s29r"] Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.673480 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6s29r"] Mar 01 09:45:47 crc kubenswrapper[4792]: I0301 09:45:47.420571 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" path="/var/lib/kubelet/pods/fab28167-0dde-44ec-a712-e11f418fd4e7/volumes" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.132295 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539306-tt9nk"] Mar 01 09:46:00 crc kubenswrapper[4792]: E0301 
09:46:00.133226 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerName="extract-utilities" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.133244 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerName="extract-utilities" Mar 01 09:46:00 crc kubenswrapper[4792]: E0301 09:46:00.133270 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerName="extract-content" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.133279 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerName="extract-content" Mar 01 09:46:00 crc kubenswrapper[4792]: E0301 09:46:00.133291 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerName="registry-server" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.133298 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerName="registry-server" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.133502 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerName="registry-server" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.134178 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539306-tt9nk" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.136066 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.139234 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.139550 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.141849 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539306-tt9nk"] Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.237958 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-462ld\" (UniqueName: \"kubernetes.io/projected/f5d29db8-7573-4364-9e18-20658b790d1f-kube-api-access-462ld\") pod \"auto-csr-approver-29539306-tt9nk\" (UID: \"f5d29db8-7573-4364-9e18-20658b790d1f\") " pod="openshift-infra/auto-csr-approver-29539306-tt9nk" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.339605 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-462ld\" (UniqueName: \"kubernetes.io/projected/f5d29db8-7573-4364-9e18-20658b790d1f-kube-api-access-462ld\") pod \"auto-csr-approver-29539306-tt9nk\" (UID: \"f5d29db8-7573-4364-9e18-20658b790d1f\") " pod="openshift-infra/auto-csr-approver-29539306-tt9nk" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.358735 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-462ld\" (UniqueName: \"kubernetes.io/projected/f5d29db8-7573-4364-9e18-20658b790d1f-kube-api-access-462ld\") pod \"auto-csr-approver-29539306-tt9nk\" (UID: \"f5d29db8-7573-4364-9e18-20658b790d1f\") " 
pod="openshift-infra/auto-csr-approver-29539306-tt9nk" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.449653 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539306-tt9nk" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.880301 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539306-tt9nk"] Mar 01 09:46:01 crc kubenswrapper[4792]: I0301 09:46:01.446255 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539306-tt9nk" event={"ID":"f5d29db8-7573-4364-9e18-20658b790d1f","Type":"ContainerStarted","Data":"0bedb45f125679d6b8c0bd25245b8ca7d427fb9cc496e01b3f86aa772d139092"} Mar 01 09:46:02 crc kubenswrapper[4792]: E0301 09:46:02.300030 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5d29db8_7573_4364_9e18_20658b790d1f.slice/crio-conmon-739afaccd13faa05c6c15e2c6b70ac689c35aa13a309f0b869c97a20dddff65e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5d29db8_7573_4364_9e18_20658b790d1f.slice/crio-739afaccd13faa05c6c15e2c6b70ac689c35aa13a309f0b869c97a20dddff65e.scope\": RecentStats: unable to find data in memory cache]" Mar 01 09:46:02 crc kubenswrapper[4792]: I0301 09:46:02.455957 4792 generic.go:334] "Generic (PLEG): container finished" podID="f5d29db8-7573-4364-9e18-20658b790d1f" containerID="739afaccd13faa05c6c15e2c6b70ac689c35aa13a309f0b869c97a20dddff65e" exitCode=0 Mar 01 09:46:02 crc kubenswrapper[4792]: I0301 09:46:02.456003 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539306-tt9nk" event={"ID":"f5d29db8-7573-4364-9e18-20658b790d1f","Type":"ContainerDied","Data":"739afaccd13faa05c6c15e2c6b70ac689c35aa13a309f0b869c97a20dddff65e"} Mar 01 
09:46:03 crc kubenswrapper[4792]: I0301 09:46:03.769253 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539306-tt9nk"
Mar 01 09:46:03 crc kubenswrapper[4792]: I0301 09:46:03.904018 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-462ld\" (UniqueName: \"kubernetes.io/projected/f5d29db8-7573-4364-9e18-20658b790d1f-kube-api-access-462ld\") pod \"f5d29db8-7573-4364-9e18-20658b790d1f\" (UID: \"f5d29db8-7573-4364-9e18-20658b790d1f\") "
Mar 01 09:46:03 crc kubenswrapper[4792]: I0301 09:46:03.909194 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d29db8-7573-4364-9e18-20658b790d1f-kube-api-access-462ld" (OuterVolumeSpecName: "kube-api-access-462ld") pod "f5d29db8-7573-4364-9e18-20658b790d1f" (UID: "f5d29db8-7573-4364-9e18-20658b790d1f"). InnerVolumeSpecName "kube-api-access-462ld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.006188 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-462ld\" (UniqueName: \"kubernetes.io/projected/f5d29db8-7573-4364-9e18-20658b790d1f-kube-api-access-462ld\") on node \"crc\" DevicePath \"\""
Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.471572 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539306-tt9nk" event={"ID":"f5d29db8-7573-4364-9e18-20658b790d1f","Type":"ContainerDied","Data":"0bedb45f125679d6b8c0bd25245b8ca7d427fb9cc496e01b3f86aa772d139092"}
Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.471610 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bedb45f125679d6b8c0bd25245b8ca7d427fb9cc496e01b3f86aa772d139092"
Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.471631 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539306-tt9nk"
Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.843026 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539300-ws5xb"]
Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.851284 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539300-ws5xb"]
Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.942456 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.942498 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.942532 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv"
Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.943175 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80f7f8ed75933ce29e4ba1caec37158448f83396fe1f5a8bba0233aec8df1ec7"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.943217 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://80f7f8ed75933ce29e4ba1caec37158448f83396fe1f5a8bba0233aec8df1ec7" gracePeriod=600
Mar 01 09:46:05 crc kubenswrapper[4792]: I0301 09:46:05.432228 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79584910-9524-4e1c-8edf-5411aa71eb0a" path="/var/lib/kubelet/pods/79584910-9524-4e1c-8edf-5411aa71eb0a/volumes"
Mar 01 09:46:05 crc kubenswrapper[4792]: I0301 09:46:05.484257 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="80f7f8ed75933ce29e4ba1caec37158448f83396fe1f5a8bba0233aec8df1ec7" exitCode=0
Mar 01 09:46:05 crc kubenswrapper[4792]: I0301 09:46:05.484307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"80f7f8ed75933ce29e4ba1caec37158448f83396fe1f5a8bba0233aec8df1ec7"}
Mar 01 09:46:05 crc kubenswrapper[4792]: I0301 09:46:05.484413 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983"}
Mar 01 09:46:05 crc kubenswrapper[4792]: I0301 09:46:05.484478 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e"
Mar 01 09:46:27 crc kubenswrapper[4792]: I0301 09:46:27.883524 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6n47w"]
Mar 01 09:46:27 crc kubenswrapper[4792]: E0301 09:46:27.884275 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d29db8-7573-4364-9e18-20658b790d1f" containerName="oc"
Mar 01 09:46:27 crc kubenswrapper[4792]: I0301 09:46:27.884287 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d29db8-7573-4364-9e18-20658b790d1f" containerName="oc"
Mar 01 09:46:27 crc kubenswrapper[4792]: I0301 09:46:27.884457 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5d29db8-7573-4364-9e18-20658b790d1f" containerName="oc"
Mar 01 09:46:27 crc kubenswrapper[4792]: I0301 09:46:27.885596 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:27 crc kubenswrapper[4792]: I0301 09:46:27.894090 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n47w"]
Mar 01 09:46:27 crc kubenswrapper[4792]: I0301 09:46:27.955989 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-catalog-content\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:27 crc kubenswrapper[4792]: I0301 09:46:27.956333 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-utilities\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:27 crc kubenswrapper[4792]: I0301 09:46:27.956407 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh5px\" (UniqueName: \"kubernetes.io/projected/54fe8edc-deda-4a44-b14f-263f77d4c545-kube-api-access-dh5px\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:28 crc kubenswrapper[4792]: I0301 09:46:28.058017 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-catalog-content\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:28 crc kubenswrapper[4792]: I0301 09:46:28.058115 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-utilities\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:28 crc kubenswrapper[4792]: I0301 09:46:28.058181 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh5px\" (UniqueName: \"kubernetes.io/projected/54fe8edc-deda-4a44-b14f-263f77d4c545-kube-api-access-dh5px\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:28 crc kubenswrapper[4792]: I0301 09:46:28.058588 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-utilities\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:28 crc kubenswrapper[4792]: I0301 09:46:28.058621 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-catalog-content\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:28 crc kubenswrapper[4792]: I0301 09:46:28.084516 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh5px\" (UniqueName: \"kubernetes.io/projected/54fe8edc-deda-4a44-b14f-263f77d4c545-kube-api-access-dh5px\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:28 crc kubenswrapper[4792]: I0301 09:46:28.205349 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:28 crc kubenswrapper[4792]: I0301 09:46:28.719131 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n47w"]
Mar 01 09:46:29 crc kubenswrapper[4792]: I0301 09:46:29.686490 4792 generic.go:334] "Generic (PLEG): container finished" podID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerID="167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16" exitCode=0
Mar 01 09:46:29 crc kubenswrapper[4792]: I0301 09:46:29.686744 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n47w" event={"ID":"54fe8edc-deda-4a44-b14f-263f77d4c545","Type":"ContainerDied","Data":"167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16"}
Mar 01 09:46:29 crc kubenswrapper[4792]: I0301 09:46:29.686794 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n47w" event={"ID":"54fe8edc-deda-4a44-b14f-263f77d4c545","Type":"ContainerStarted","Data":"aeb132f4e48cdf93635bdad095b234bbcae82fcff8988e2298e8a24863eaf81c"}
Mar 01 09:46:30 crc kubenswrapper[4792]: I0301 09:46:30.695027 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n47w" event={"ID":"54fe8edc-deda-4a44-b14f-263f77d4c545","Type":"ContainerStarted","Data":"c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07"}
Mar 01 09:46:30 crc kubenswrapper[4792]: I0301 09:46:30.811070 4792 scope.go:117] "RemoveContainer" containerID="c7d695ec4b923c131de05f4b3929cfdf1d810db81219d72b727b76b052713eaf"
Mar 01 09:46:30 crc kubenswrapper[4792]: I0301 09:46:30.848204 4792 scope.go:117] "RemoveContainer" containerID="4100c168a5febe541b2b6fdd770ebafed4e19b504bd5f0104ce3f530de9d8c6d"
Mar 01 09:46:30 crc kubenswrapper[4792]: I0301 09:46:30.905180 4792 scope.go:117] "RemoveContainer" containerID="8579b82c3f2aeab429db244bb7b4d62bd406e57babe3af839bf5e91664e2433c"
Mar 01 09:46:31 crc kubenswrapper[4792]: I0301 09:46:31.703460 4792 generic.go:334] "Generic (PLEG): container finished" podID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerID="c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07" exitCode=0
Mar 01 09:46:31 crc kubenswrapper[4792]: I0301 09:46:31.703499 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n47w" event={"ID":"54fe8edc-deda-4a44-b14f-263f77d4c545","Type":"ContainerDied","Data":"c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07"}
Mar 01 09:46:32 crc kubenswrapper[4792]: I0301 09:46:32.716038 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n47w" event={"ID":"54fe8edc-deda-4a44-b14f-263f77d4c545","Type":"ContainerStarted","Data":"777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd"}
Mar 01 09:46:32 crc kubenswrapper[4792]: I0301 09:46:32.741495 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6n47w" podStartSLOduration=3.262183815 podStartE2EDuration="5.741461864s" podCreationTimestamp="2026-03-01 09:46:27 +0000 UTC" firstStartedPulling="2026-03-01 09:46:29.690345433 +0000 UTC m=+2318.932224630" lastFinishedPulling="2026-03-01 09:46:32.169623472 +0000 UTC m=+2321.411502679" observedRunningTime="2026-03-01 09:46:32.739493665 +0000 UTC m=+2321.981372912" watchObservedRunningTime="2026-03-01 09:46:32.741461864 +0000 UTC m=+2321.983341071"
Mar 01 09:46:38 crc kubenswrapper[4792]: I0301 09:46:38.205806 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:38 crc kubenswrapper[4792]: I0301 09:46:38.207515 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:38 crc kubenswrapper[4792]: I0301 09:46:38.252557 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:38 crc kubenswrapper[4792]: I0301 09:46:38.805387 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:38 crc kubenswrapper[4792]: I0301 09:46:38.857764 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n47w"]
Mar 01 09:46:40 crc kubenswrapper[4792]: I0301 09:46:40.770493 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6n47w" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="registry-server" containerID="cri-o://777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd" gracePeriod=2
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.286342 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.412375 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh5px\" (UniqueName: \"kubernetes.io/projected/54fe8edc-deda-4a44-b14f-263f77d4c545-kube-api-access-dh5px\") pod \"54fe8edc-deda-4a44-b14f-263f77d4c545\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") "
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.412536 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-utilities\") pod \"54fe8edc-deda-4a44-b14f-263f77d4c545\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") "
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.412591 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-catalog-content\") pod \"54fe8edc-deda-4a44-b14f-263f77d4c545\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") "
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.416223 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-utilities" (OuterVolumeSpecName: "utilities") pod "54fe8edc-deda-4a44-b14f-263f77d4c545" (UID: "54fe8edc-deda-4a44-b14f-263f77d4c545"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.420303 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54fe8edc-deda-4a44-b14f-263f77d4c545-kube-api-access-dh5px" (OuterVolumeSpecName: "kube-api-access-dh5px") pod "54fe8edc-deda-4a44-b14f-263f77d4c545" (UID: "54fe8edc-deda-4a44-b14f-263f77d4c545"). InnerVolumeSpecName "kube-api-access-dh5px". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.440998 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54fe8edc-deda-4a44-b14f-263f77d4c545" (UID: "54fe8edc-deda-4a44-b14f-263f77d4c545"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.514894 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-utilities\") on node \"crc\" DevicePath \"\""
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.514938 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.514954 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh5px\" (UniqueName: \"kubernetes.io/projected/54fe8edc-deda-4a44-b14f-263f77d4c545-kube-api-access-dh5px\") on node \"crc\" DevicePath \"\""
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.787773 4792 generic.go:334] "Generic (PLEG): container finished" podID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerID="777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd" exitCode=0
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.788015 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n47w" event={"ID":"54fe8edc-deda-4a44-b14f-263f77d4c545","Type":"ContainerDied","Data":"777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd"}
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.788148 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n47w"
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.788158 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n47w" event={"ID":"54fe8edc-deda-4a44-b14f-263f77d4c545","Type":"ContainerDied","Data":"aeb132f4e48cdf93635bdad095b234bbcae82fcff8988e2298e8a24863eaf81c"}
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.788182 4792 scope.go:117] "RemoveContainer" containerID="777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd"
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.816369 4792 scope.go:117] "RemoveContainer" containerID="c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07"
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.818746 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n47w"]
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.825891 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n47w"]
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.843932 4792 scope.go:117] "RemoveContainer" containerID="167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16"
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.867257 4792 scope.go:117] "RemoveContainer" containerID="777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd"
Mar 01 09:46:41 crc kubenswrapper[4792]: E0301 09:46:41.867720 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd\": container with ID starting with 777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd not found: ID does not exist" containerID="777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd"
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.867749 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd"} err="failed to get container status \"777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd\": rpc error: code = NotFound desc = could not find container \"777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd\": container with ID starting with 777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd not found: ID does not exist"
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.867770 4792 scope.go:117] "RemoveContainer" containerID="c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07"
Mar 01 09:46:41 crc kubenswrapper[4792]: E0301 09:46:41.868178 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07\": container with ID starting with c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07 not found: ID does not exist" containerID="c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07"
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.868207 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07"} err="failed to get container status \"c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07\": rpc error: code = NotFound desc = could not find container \"c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07\": container with ID starting with c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07 not found: ID does not exist"
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.868219 4792 scope.go:117] "RemoveContainer" containerID="167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16"
Mar 01 09:46:41 crc kubenswrapper[4792]: E0301 09:46:41.868392 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16\": container with ID starting with 167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16 not found: ID does not exist" containerID="167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16"
Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.868419 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16"} err="failed to get container status \"167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16\": rpc error: code = NotFound desc = could not find container \"167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16\": container with ID starting with 167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16 not found: ID does not exist"
Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.421518 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" path="/var/lib/kubelet/pods/54fe8edc-deda-4a44-b14f-263f77d4c545/volumes"
Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.892952 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ngjhc"]
Mar 01 09:46:43 crc kubenswrapper[4792]: E0301 09:46:43.893331 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="registry-server"
Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.893345 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="registry-server"
Mar 01 09:46:43 crc kubenswrapper[4792]: E0301 09:46:43.893372 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="extract-content"
Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.893378 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="extract-content"
Mar 01 09:46:43 crc kubenswrapper[4792]: E0301 09:46:43.893386 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="extract-utilities"
Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.893392 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="extract-utilities"
Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.893546 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="registry-server"
Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.894700 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.913164 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngjhc"]
Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.960969 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-catalog-content\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.961027 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-utilities\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.961409 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kp4d\" (UniqueName: \"kubernetes.io/projected/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-kube-api-access-8kp4d\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.063411 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kp4d\" (UniqueName: \"kubernetes.io/projected/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-kube-api-access-8kp4d\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.063505 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-catalog-content\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.063527 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-utilities\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.064106 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-utilities\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.064266 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-catalog-content\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.080978 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kp4d\" (UniqueName: \"kubernetes.io/projected/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-kube-api-access-8kp4d\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.218215 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.817396 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngjhc"]
Mar 01 09:46:45 crc kubenswrapper[4792]: I0301 09:46:45.813884 4792 generic.go:334] "Generic (PLEG): container finished" podID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerID="4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3" exitCode=0
Mar 01 09:46:45 crc kubenswrapper[4792]: I0301 09:46:45.814048 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngjhc" event={"ID":"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3","Type":"ContainerDied","Data":"4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3"}
Mar 01 09:46:45 crc kubenswrapper[4792]: I0301 09:46:45.814420 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngjhc" event={"ID":"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3","Type":"ContainerStarted","Data":"69f0c5dc0c1619083b7daceb6ad82ab03f1d67532fb7c06ccbeefe11fdc99439"}
Mar 01 09:46:46 crc kubenswrapper[4792]: I0301 09:46:46.823335 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngjhc" event={"ID":"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3","Type":"ContainerStarted","Data":"6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303"}
Mar 01 09:46:50 crc kubenswrapper[4792]: I0301 09:46:50.854602 4792 generic.go:334] "Generic (PLEG): container finished" podID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerID="6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303" exitCode=0
Mar 01 09:46:50 crc kubenswrapper[4792]: I0301 09:46:50.854794 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngjhc" event={"ID":"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3","Type":"ContainerDied","Data":"6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303"}
Mar 01 09:46:52 crc kubenswrapper[4792]: I0301 09:46:52.872605 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngjhc" event={"ID":"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3","Type":"ContainerStarted","Data":"ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e"}
Mar 01 09:46:52 crc kubenswrapper[4792]: I0301 09:46:52.902258 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ngjhc" podStartSLOduration=3.439604086 podStartE2EDuration="9.902238425s" podCreationTimestamp="2026-03-01 09:46:43 +0000 UTC" firstStartedPulling="2026-03-01 09:46:45.815958058 +0000 UTC m=+2335.057837265" lastFinishedPulling="2026-03-01 09:46:52.278592407 +0000 UTC m=+2341.520471604" observedRunningTime="2026-03-01 09:46:52.899580779 +0000 UTC m=+2342.141459976" watchObservedRunningTime="2026-03-01 09:46:52.902238425 +0000 UTC m=+2342.144117622"
Mar 01 09:46:54 crc kubenswrapper[4792]: I0301 09:46:54.218841 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:46:54 crc kubenswrapper[4792]: I0301 09:46:54.219213 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:46:55 crc kubenswrapper[4792]: I0301 09:46:55.265517 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-ngjhc" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="registry-server" probeResult="failure" output=<
Mar 01 09:46:55 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 01 09:46:55 crc kubenswrapper[4792]: >
Mar 01 09:47:04 crc kubenswrapper[4792]: I0301 09:47:04.261778 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:47:04 crc kubenswrapper[4792]: I0301 09:47:04.314063 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:47:04 crc kubenswrapper[4792]: I0301 09:47:04.496292 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ngjhc"]
Mar 01 09:47:05 crc kubenswrapper[4792]: I0301 09:47:05.959938 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ngjhc" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="registry-server" containerID="cri-o://ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e" gracePeriod=2
Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.386334 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngjhc"
Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.559664 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kp4d\" (UniqueName: \"kubernetes.io/projected/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-kube-api-access-8kp4d\") pod \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") "
Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.559723 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-catalog-content\") pod \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") "
Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.559958 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-utilities\") pod \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") "
Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.562849 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-utilities" (OuterVolumeSpecName: "utilities") pod "d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" (UID: "d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.569604 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-kube-api-access-8kp4d" (OuterVolumeSpecName: "kube-api-access-8kp4d") pod "d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" (UID: "d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3"). InnerVolumeSpecName "kube-api-access-8kp4d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.630570 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" (UID: "d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.662104 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kp4d\" (UniqueName: \"kubernetes.io/projected/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-kube-api-access-8kp4d\") on node \"crc\" DevicePath \"\""
Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.662130 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.662140 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-utilities\") on node \"crc\" DevicePath \"\""
Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.969581 4792 generic.go:334] "Generic (PLEG): container finished" podID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerID="ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e" exitCode=0
Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.969632 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngjhc" event={"ID":"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3","Type":"ContainerDied","Data":"ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e"}
Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.969693 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngjhc" event={"ID":"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3","Type":"ContainerDied","Data":"69f0c5dc0c1619083b7daceb6ad82ab03f1d67532fb7c06ccbeefe11fdc99439"}
Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.969714 4792 scope.go:117] "RemoveContainer" containerID="ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e"
Mar 01 09:47:06 crc kubenswrapper[4792]: I0301
09:47:06.969865 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:06.999228 4792 scope.go:117] "RemoveContainer" containerID="6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.005284 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ngjhc"] Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.011083 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ngjhc"] Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.020355 4792 scope.go:117] "RemoveContainer" containerID="4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.061125 4792 scope.go:117] "RemoveContainer" containerID="ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e" Mar 01 09:47:07 crc kubenswrapper[4792]: E0301 09:47:07.061702 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e\": container with ID starting with ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e not found: ID does not exist" containerID="ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.061801 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e"} err="failed to get container status \"ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e\": rpc error: code = NotFound desc = could not find container \"ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e\": container with ID starting with 
ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e not found: ID does not exist" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.061925 4792 scope.go:117] "RemoveContainer" containerID="6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303" Mar 01 09:47:07 crc kubenswrapper[4792]: E0301 09:47:07.062223 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303\": container with ID starting with 6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303 not found: ID does not exist" containerID="6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.062329 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303"} err="failed to get container status \"6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303\": rpc error: code = NotFound desc = could not find container \"6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303\": container with ID starting with 6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303 not found: ID does not exist" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.062417 4792 scope.go:117] "RemoveContainer" containerID="4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3" Mar 01 09:47:07 crc kubenswrapper[4792]: E0301 09:47:07.062787 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3\": container with ID starting with 4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3 not found: ID does not exist" containerID="4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3" Mar 01 09:47:07 crc 
kubenswrapper[4792]: I0301 09:47:07.062882 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3"} err="failed to get container status \"4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3\": rpc error: code = NotFound desc = could not find container \"4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3\": container with ID starting with 4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3 not found: ID does not exist" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.419684 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" path="/var/lib/kubelet/pods/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3/volumes" Mar 01 09:47:20 crc kubenswrapper[4792]: I0301 09:47:20.079766 4792 generic.go:334] "Generic (PLEG): container finished" podID="1201ca91-41eb-45d0-991d-71883b4014ae" containerID="f9bde77e46db7a7a43711f203d62d437c0037f5981ab90809f1911e681e22c3b" exitCode=0 Mar 01 09:47:20 crc kubenswrapper[4792]: I0301 09:47:20.079861 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" event={"ID":"1201ca91-41eb-45d0-991d-71883b4014ae","Type":"ContainerDied","Data":"f9bde77e46db7a7a43711f203d62d437c0037f5981ab90809f1911e681e22c3b"} Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.518127 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.646492 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ceph\") pod \"1201ca91-41eb-45d0-991d-71883b4014ae\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.646546 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ssh-key-openstack-edpm-ipam\") pod \"1201ca91-41eb-45d0-991d-71883b4014ae\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.646600 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-bootstrap-combined-ca-bundle\") pod \"1201ca91-41eb-45d0-991d-71883b4014ae\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.646630 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx4hm\" (UniqueName: \"kubernetes.io/projected/1201ca91-41eb-45d0-991d-71883b4014ae-kube-api-access-vx4hm\") pod \"1201ca91-41eb-45d0-991d-71883b4014ae\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.646662 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-inventory\") pod \"1201ca91-41eb-45d0-991d-71883b4014ae\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.651272 4792 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1201ca91-41eb-45d0-991d-71883b4014ae" (UID: "1201ca91-41eb-45d0-991d-71883b4014ae"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.651430 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ceph" (OuterVolumeSpecName: "ceph") pod "1201ca91-41eb-45d0-991d-71883b4014ae" (UID: "1201ca91-41eb-45d0-991d-71883b4014ae"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.652923 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1201ca91-41eb-45d0-991d-71883b4014ae-kube-api-access-vx4hm" (OuterVolumeSpecName: "kube-api-access-vx4hm") pod "1201ca91-41eb-45d0-991d-71883b4014ae" (UID: "1201ca91-41eb-45d0-991d-71883b4014ae"). InnerVolumeSpecName "kube-api-access-vx4hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.675090 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1201ca91-41eb-45d0-991d-71883b4014ae" (UID: "1201ca91-41eb-45d0-991d-71883b4014ae"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.675525 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-inventory" (OuterVolumeSpecName: "inventory") pod "1201ca91-41eb-45d0-991d-71883b4014ae" (UID: "1201ca91-41eb-45d0-991d-71883b4014ae"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.748585 4792 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.748614 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx4hm\" (UniqueName: \"kubernetes.io/projected/1201ca91-41eb-45d0-991d-71883b4014ae-kube-api-access-vx4hm\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.748624 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.748632 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.748641 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.097459 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" event={"ID":"1201ca91-41eb-45d0-991d-71883b4014ae","Type":"ContainerDied","Data":"d3a561fc80333203759a7141294eee19a809aea7b68c78b4e20c79a31fc48cc8"} Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.097506 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3a561fc80333203759a7141294eee19a809aea7b68c78b4e20c79a31fc48cc8" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.097557 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.187622 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks"] Mar 01 09:47:22 crc kubenswrapper[4792]: E0301 09:47:22.187968 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="registry-server" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.187984 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="registry-server" Mar 01 09:47:22 crc kubenswrapper[4792]: E0301 09:47:22.187994 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="extract-utilities" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.188002 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="extract-utilities" Mar 01 09:47:22 crc kubenswrapper[4792]: E0301 09:47:22.188023 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="extract-content" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.188030 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="extract-content" Mar 01 09:47:22 crc kubenswrapper[4792]: E0301 09:47:22.188046 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1201ca91-41eb-45d0-991d-71883b4014ae" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.188052 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1201ca91-41eb-45d0-991d-71883b4014ae" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.188209 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="registry-server" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.188223 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1201ca91-41eb-45d0-991d-71883b4014ae" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.188834 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.193948 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.194195 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.194346 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.194610 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.194673 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.216468 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks"] Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.358059 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.358473 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: 
\"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.358561 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.358613 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lck8\" (UniqueName: \"kubernetes.io/projected/f25228f4-912f-408c-a1d6-9279c350b767-kube-api-access-9lck8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.460424 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.460513 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lck8\" (UniqueName: \"kubernetes.io/projected/f25228f4-912f-408c-a1d6-9279c350b767-kube-api-access-9lck8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.460562 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.460655 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.474590 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.474630 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.475345 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.515557 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lck8\" (UniqueName: \"kubernetes.io/projected/f25228f4-912f-408c-a1d6-9279c350b767-kube-api-access-9lck8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.810408 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:23 crc kubenswrapper[4792]: I0301 09:47:23.305842 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks"] Mar 01 09:47:23 crc kubenswrapper[4792]: W0301 09:47:23.311213 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf25228f4_912f_408c_a1d6_9279c350b767.slice/crio-f2bebe2a2356f7743d8d689e986f92baffce6ac133e06edc7d2543f477b746c7 WatchSource:0}: Error finding container f2bebe2a2356f7743d8d689e986f92baffce6ac133e06edc7d2543f477b746c7: Status 404 returned error can't find the container with id f2bebe2a2356f7743d8d689e986f92baffce6ac133e06edc7d2543f477b746c7 Mar 01 09:47:24 crc kubenswrapper[4792]: I0301 09:47:24.118800 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" 
event={"ID":"f25228f4-912f-408c-a1d6-9279c350b767","Type":"ContainerStarted","Data":"19c6ec5339d76a1fab1bfb2d7d9edc634f8fcdd436f9a2e8873cd35a55eb96f9"} Mar 01 09:47:24 crc kubenswrapper[4792]: I0301 09:47:24.119171 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" event={"ID":"f25228f4-912f-408c-a1d6-9279c350b767","Type":"ContainerStarted","Data":"f2bebe2a2356f7743d8d689e986f92baffce6ac133e06edc7d2543f477b746c7"} Mar 01 09:47:24 crc kubenswrapper[4792]: I0301 09:47:24.133250 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" podStartSLOduration=1.580982858 podStartE2EDuration="2.133223s" podCreationTimestamp="2026-03-01 09:47:22 +0000 UTC" firstStartedPulling="2026-03-01 09:47:23.314029967 +0000 UTC m=+2372.555909164" lastFinishedPulling="2026-03-01 09:47:23.866270109 +0000 UTC m=+2373.108149306" observedRunningTime="2026-03-01 09:47:24.132128542 +0000 UTC m=+2373.374007749" watchObservedRunningTime="2026-03-01 09:47:24.133223 +0000 UTC m=+2373.375102197" Mar 01 09:47:50 crc kubenswrapper[4792]: I0301 09:47:50.325058 4792 generic.go:334] "Generic (PLEG): container finished" podID="f25228f4-912f-408c-a1d6-9279c350b767" containerID="19c6ec5339d76a1fab1bfb2d7d9edc634f8fcdd436f9a2e8873cd35a55eb96f9" exitCode=0 Mar 01 09:47:50 crc kubenswrapper[4792]: I0301 09:47:50.325194 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" event={"ID":"f25228f4-912f-408c-a1d6-9279c350b767","Type":"ContainerDied","Data":"19c6ec5339d76a1fab1bfb2d7d9edc634f8fcdd436f9a2e8873cd35a55eb96f9"} Mar 01 09:47:51 crc kubenswrapper[4792]: I0301 09:47:51.966036 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.064368 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ceph\") pod \"f25228f4-912f-408c-a1d6-9279c350b767\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.064468 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lck8\" (UniqueName: \"kubernetes.io/projected/f25228f4-912f-408c-a1d6-9279c350b767-kube-api-access-9lck8\") pod \"f25228f4-912f-408c-a1d6-9279c350b767\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.064527 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-inventory\") pod \"f25228f4-912f-408c-a1d6-9279c350b767\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.064549 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ssh-key-openstack-edpm-ipam\") pod \"f25228f4-912f-408c-a1d6-9279c350b767\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.070771 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25228f4-912f-408c-a1d6-9279c350b767-kube-api-access-9lck8" (OuterVolumeSpecName: "kube-api-access-9lck8") pod "f25228f4-912f-408c-a1d6-9279c350b767" (UID: "f25228f4-912f-408c-a1d6-9279c350b767"). InnerVolumeSpecName "kube-api-access-9lck8". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.073059 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ceph" (OuterVolumeSpecName: "ceph") pod "f25228f4-912f-408c-a1d6-9279c350b767" (UID: "f25228f4-912f-408c-a1d6-9279c350b767"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.091096 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f25228f4-912f-408c-a1d6-9279c350b767" (UID: "f25228f4-912f-408c-a1d6-9279c350b767"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.092296 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-inventory" (OuterVolumeSpecName: "inventory") pod "f25228f4-912f-408c-a1d6-9279c350b767" (UID: "f25228f4-912f-408c-a1d6-9279c350b767"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.166777 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ceph\") on node \"crc\" DevicePath \"\""
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.166820 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lck8\" (UniqueName: \"kubernetes.io/projected/f25228f4-912f-408c-a1d6-9279c350b767-kube-api-access-9lck8\") on node \"crc\" DevicePath \"\""
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.166832 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-inventory\") on node \"crc\" DevicePath \"\""
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.166842 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.340834 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" event={"ID":"f25228f4-912f-408c-a1d6-9279c350b767","Type":"ContainerDied","Data":"f2bebe2a2356f7743d8d689e986f92baffce6ac133e06edc7d2543f477b746c7"}
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.340871 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2bebe2a2356f7743d8d689e986f92baffce6ac133e06edc7d2543f477b746c7"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.340950 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.431131 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"]
Mar 01 09:47:52 crc kubenswrapper[4792]: E0301 09:47:52.431471 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25228f4-912f-408c-a1d6-9279c350b767" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.431483 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25228f4-912f-408c-a1d6-9279c350b767" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.431651 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25228f4-912f-408c-a1d6-9279c350b767" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.432303 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.434772 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.435089 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.435322 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.435876 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.438731 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.451249 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"]
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.578659 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8449\" (UniqueName: \"kubernetes.io/projected/59b987d8-9463-48cb-9651-1e5cb16aa764-kube-api-access-p8449\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.578796 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.578838 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.578870 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.680042 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.680123 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.680158 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.680250 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8449\" (UniqueName: \"kubernetes.io/projected/59b987d8-9463-48cb-9651-1e5cb16aa764-kube-api-access-p8449\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.690494 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.691541 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.699774 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.705111 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8449\" (UniqueName: \"kubernetes.io/projected/59b987d8-9463-48cb-9651-1e5cb16aa764-kube-api-access-p8449\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.744898 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:47:53 crc kubenswrapper[4792]: I0301 09:47:53.307515 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"]
Mar 01 09:47:53 crc kubenswrapper[4792]: I0301 09:47:53.348191 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" event={"ID":"59b987d8-9463-48cb-9651-1e5cb16aa764","Type":"ContainerStarted","Data":"0965a313b710b708c5af294657c525d6ca94115e8707a0d5c46e3b088cb75fcc"}
Mar 01 09:47:54 crc kubenswrapper[4792]: I0301 09:47:54.357598 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" event={"ID":"59b987d8-9463-48cb-9651-1e5cb16aa764","Type":"ContainerStarted","Data":"29293b309e902e27c992cdd4a10eda8522305bb670dcf4444d1ee4e4d67716f9"}
Mar 01 09:47:54 crc kubenswrapper[4792]: I0301 09:47:54.382213 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" podStartSLOduration=1.877013566 podStartE2EDuration="2.382192169s" podCreationTimestamp="2026-03-01 09:47:52 +0000 UTC" firstStartedPulling="2026-03-01 09:47:53.32378724 +0000 UTC m=+2402.565666437" lastFinishedPulling="2026-03-01 09:47:53.828965843 +0000 UTC m=+2403.070845040" observedRunningTime="2026-03-01 09:47:54.380052015 +0000 UTC m=+2403.621931222" watchObservedRunningTime="2026-03-01 09:47:54.382192169 +0000 UTC m=+2403.624071366"
Mar 01 09:47:59 crc kubenswrapper[4792]: I0301 09:47:59.399068 4792 generic.go:334] "Generic (PLEG): container finished" podID="59b987d8-9463-48cb-9651-1e5cb16aa764" containerID="29293b309e902e27c992cdd4a10eda8522305bb670dcf4444d1ee4e4d67716f9" exitCode=0
Mar 01 09:47:59 crc kubenswrapper[4792]: I0301 09:47:59.399260 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" event={"ID":"59b987d8-9463-48cb-9651-1e5cb16aa764","Type":"ContainerDied","Data":"29293b309e902e27c992cdd4a10eda8522305bb670dcf4444d1ee4e4d67716f9"}
Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.125751 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539308-lnc4n"]
Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.126931 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539308-lnc4n"
Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.128982 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.131247 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.131428 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz"
Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.141786 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539308-lnc4n"]
Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.248393 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt7t7\" (UniqueName: \"kubernetes.io/projected/1e3f198d-a642-45b3-9a5a-fd5906670db8-kube-api-access-jt7t7\") pod \"auto-csr-approver-29539308-lnc4n\" (UID: \"1e3f198d-a642-45b3-9a5a-fd5906670db8\") " pod="openshift-infra/auto-csr-approver-29539308-lnc4n"
Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.350163 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt7t7\" (UniqueName: \"kubernetes.io/projected/1e3f198d-a642-45b3-9a5a-fd5906670db8-kube-api-access-jt7t7\") pod \"auto-csr-approver-29539308-lnc4n\" (UID: \"1e3f198d-a642-45b3-9a5a-fd5906670db8\") " pod="openshift-infra/auto-csr-approver-29539308-lnc4n"
Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.368773 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt7t7\" (UniqueName: \"kubernetes.io/projected/1e3f198d-a642-45b3-9a5a-fd5906670db8-kube-api-access-jt7t7\") pod \"auto-csr-approver-29539308-lnc4n\" (UID: \"1e3f198d-a642-45b3-9a5a-fd5906670db8\") " pod="openshift-infra/auto-csr-approver-29539308-lnc4n"
Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.453437 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539308-lnc4n"
Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.973598 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539308-lnc4n"]
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.009775 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.062866 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-inventory\") pod \"59b987d8-9463-48cb-9651-1e5cb16aa764\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") "
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.063402 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8449\" (UniqueName: \"kubernetes.io/projected/59b987d8-9463-48cb-9651-1e5cb16aa764-kube-api-access-p8449\") pod \"59b987d8-9463-48cb-9651-1e5cb16aa764\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") "
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.063496 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ssh-key-openstack-edpm-ipam\") pod \"59b987d8-9463-48cb-9651-1e5cb16aa764\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") "
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.063520 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ceph\") pod \"59b987d8-9463-48cb-9651-1e5cb16aa764\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") "
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.068489 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ceph" (OuterVolumeSpecName: "ceph") pod "59b987d8-9463-48cb-9651-1e5cb16aa764" (UID: "59b987d8-9463-48cb-9651-1e5cb16aa764"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.068801 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59b987d8-9463-48cb-9651-1e5cb16aa764-kube-api-access-p8449" (OuterVolumeSpecName: "kube-api-access-p8449") pod "59b987d8-9463-48cb-9651-1e5cb16aa764" (UID: "59b987d8-9463-48cb-9651-1e5cb16aa764"). InnerVolumeSpecName "kube-api-access-p8449". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.092378 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "59b987d8-9463-48cb-9651-1e5cb16aa764" (UID: "59b987d8-9463-48cb-9651-1e5cb16aa764"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.092680 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-inventory" (OuterVolumeSpecName: "inventory") pod "59b987d8-9463-48cb-9651-1e5cb16aa764" (UID: "59b987d8-9463-48cb-9651-1e5cb16aa764"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.164932 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-inventory\") on node \"crc\" DevicePath \"\""
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.164967 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8449\" (UniqueName: \"kubernetes.io/projected/59b987d8-9463-48cb-9651-1e5cb16aa764-kube-api-access-p8449\") on node \"crc\" DevicePath \"\""
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.164979 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.164987 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ceph\") on node \"crc\" DevicePath \"\""
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.422763 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539308-lnc4n" event={"ID":"1e3f198d-a642-45b3-9a5a-fd5906670db8","Type":"ContainerStarted","Data":"824c90eca7cbf0dcb5df0b863beabbb76571e3954f409dbed59aaeac345a811b"}
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.424158 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" event={"ID":"59b987d8-9463-48cb-9651-1e5cb16aa764","Type":"ContainerDied","Data":"0965a313b710b708c5af294657c525d6ca94115e8707a0d5c46e3b088cb75fcc"}
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.424201 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0965a313b710b708c5af294657c525d6ca94115e8707a0d5c46e3b088cb75fcc"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.424250 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.496479 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"]
Mar 01 09:48:01 crc kubenswrapper[4792]: E0301 09:48:01.496791 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b987d8-9463-48cb-9651-1e5cb16aa764" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.496804 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b987d8-9463-48cb-9651-1e5cb16aa764" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.496990 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="59b987d8-9463-48cb-9651-1e5cb16aa764" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.497579 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.500737 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.500916 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.501490 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.502033 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.505043 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.541245 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"]
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.674262 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8g2n\" (UniqueName: \"kubernetes.io/projected/822af429-9091-43e5-a16d-7a287f2c5bb2-kube-api-access-m8g2n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.674356 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.674515 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.674674 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.776679 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.776737 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.776805 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.776960 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8g2n\" (UniqueName: \"kubernetes.io/projected/822af429-9091-43e5-a16d-7a287f2c5bb2-kube-api-access-m8g2n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.781651 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.781685 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.782461 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.805294 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8g2n\" (UniqueName: \"kubernetes.io/projected/822af429-9091-43e5-a16d-7a287f2c5bb2-kube-api-access-m8g2n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"
Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.812146 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"
Mar 01 09:48:02 crc kubenswrapper[4792]: I0301 09:48:02.361860 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"]
Mar 01 09:48:02 crc kubenswrapper[4792]: W0301 09:48:02.372109 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod822af429_9091_43e5_a16d_7a287f2c5bb2.slice/crio-81b2f396188a63440fcba678466e372145f1b2a7a97016d2c98f9e382bb3bf96 WatchSource:0}: Error finding container 81b2f396188a63440fcba678466e372145f1b2a7a97016d2c98f9e382bb3bf96: Status 404 returned error can't find the container with id 81b2f396188a63440fcba678466e372145f1b2a7a97016d2c98f9e382bb3bf96
Mar 01 09:48:02 crc kubenswrapper[4792]: I0301 09:48:02.431957 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" event={"ID":"822af429-9091-43e5-a16d-7a287f2c5bb2","Type":"ContainerStarted","Data":"81b2f396188a63440fcba678466e372145f1b2a7a97016d2c98f9e382bb3bf96"}
Mar 01 09:48:04 crc kubenswrapper[4792]: I0301 09:48:04.450047 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" event={"ID":"822af429-9091-43e5-a16d-7a287f2c5bb2","Type":"ContainerStarted","Data":"97d084e532d6eb8f554aee35118c23f6e5264bb4a9857125f5d1e0995c3a8746"}
Mar 01 09:48:04 crc kubenswrapper[4792]: I0301 09:48:04.452189 4792 generic.go:334] "Generic (PLEG): container finished" podID="1e3f198d-a642-45b3-9a5a-fd5906670db8" containerID="cde0b22712c7c2f1430743fdebf0e1e49438b47b056e66c49fd78cf546ba54f9" exitCode=0
Mar 01 09:48:04 crc kubenswrapper[4792]: I0301 09:48:04.452239 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539308-lnc4n" event={"ID":"1e3f198d-a642-45b3-9a5a-fd5906670db8","Type":"ContainerDied","Data":"cde0b22712c7c2f1430743fdebf0e1e49438b47b056e66c49fd78cf546ba54f9"}
Mar 01 09:48:04 crc kubenswrapper[4792]: I0301 09:48:04.465372 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" podStartSLOduration=2.284158907 podStartE2EDuration="3.465355819s" podCreationTimestamp="2026-03-01 09:48:01 +0000 UTC" firstStartedPulling="2026-03-01 09:48:02.373952547 +0000 UTC m=+2411.615831744" lastFinishedPulling="2026-03-01 09:48:03.555149459 +0000 UTC m=+2412.797028656" observedRunningTime="2026-03-01 09:48:04.463529803 +0000 UTC m=+2413.705409000" watchObservedRunningTime="2026-03-01 09:48:04.465355819 +0000 UTC m=+2413.707235016"
Mar 01 09:48:05 crc kubenswrapper[4792]: I0301 09:48:05.787223 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539308-lnc4n"
Mar 01 09:48:05 crc kubenswrapper[4792]: I0301 09:48:05.859355 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt7t7\" (UniqueName: \"kubernetes.io/projected/1e3f198d-a642-45b3-9a5a-fd5906670db8-kube-api-access-jt7t7\") pod \"1e3f198d-a642-45b3-9a5a-fd5906670db8\" (UID: \"1e3f198d-a642-45b3-9a5a-fd5906670db8\") "
Mar 01 09:48:05 crc kubenswrapper[4792]: I0301 09:48:05.867174 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e3f198d-a642-45b3-9a5a-fd5906670db8-kube-api-access-jt7t7" (OuterVolumeSpecName: "kube-api-access-jt7t7") pod "1e3f198d-a642-45b3-9a5a-fd5906670db8" (UID: "1e3f198d-a642-45b3-9a5a-fd5906670db8"). InnerVolumeSpecName "kube-api-access-jt7t7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:48:05 crc kubenswrapper[4792]: I0301 09:48:05.962094 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt7t7\" (UniqueName: \"kubernetes.io/projected/1e3f198d-a642-45b3-9a5a-fd5906670db8-kube-api-access-jt7t7\") on node \"crc\" DevicePath \"\""
Mar 01 09:48:06 crc kubenswrapper[4792]: I0301 09:48:06.468466 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539308-lnc4n" event={"ID":"1e3f198d-a642-45b3-9a5a-fd5906670db8","Type":"ContainerDied","Data":"824c90eca7cbf0dcb5df0b863beabbb76571e3954f409dbed59aaeac345a811b"}
Mar 01 09:48:06 crc kubenswrapper[4792]: I0301 09:48:06.468500 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="824c90eca7cbf0dcb5df0b863beabbb76571e3954f409dbed59aaeac345a811b"
Mar 01 09:48:06 crc kubenswrapper[4792]: I0301 09:48:06.468527 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539308-lnc4n"
Mar 01 09:48:06 crc kubenswrapper[4792]: I0301 09:48:06.855023 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539302-26gpt"]
Mar 01 09:48:06 crc kubenswrapper[4792]: I0301 09:48:06.860933 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539302-26gpt"]
Mar 01 09:48:07 crc kubenswrapper[4792]: I0301 09:48:07.423016 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5" path="/var/lib/kubelet/pods/792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5/volumes"
Mar 01 09:48:31 crc kubenswrapper[4792]: I0301 09:48:31.120295 4792 scope.go:117] "RemoveContainer" containerID="53fe1a8f0f86c9e965b90816a9566427d372fba1cc22db1d2bb0ca2e72f57708"
Mar 01 09:48:34 crc kubenswrapper[4792]: I0301 09:48:34.943144 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:48:34 crc kubenswrapper[4792]: I0301 09:48:34.943653 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:48:41 crc kubenswrapper[4792]: I0301 09:48:41.779789 4792 generic.go:334] "Generic (PLEG): container finished" podID="822af429-9091-43e5-a16d-7a287f2c5bb2" containerID="97d084e532d6eb8f554aee35118c23f6e5264bb4a9857125f5d1e0995c3a8746" exitCode=0
Mar 01 09:48:41 crc kubenswrapper[4792]: I0301 09:48:41.779869 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" event={"ID":"822af429-9091-43e5-a16d-7a287f2c5bb2","Type":"ContainerDied","Data":"97d084e532d6eb8f554aee35118c23f6e5264bb4a9857125f5d1e0995c3a8746"}
Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.178700 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"
Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.249399 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory\") pod \"822af429-9091-43e5-a16d-7a287f2c5bb2\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") "
Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.249540 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ssh-key-openstack-edpm-ipam\") pod \"822af429-9091-43e5-a16d-7a287f2c5bb2\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") "
Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.249583 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8g2n\" (UniqueName: \"kubernetes.io/projected/822af429-9091-43e5-a16d-7a287f2c5bb2-kube-api-access-m8g2n\") pod \"822af429-9091-43e5-a16d-7a287f2c5bb2\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") "
Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.249608 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ceph\") pod \"822af429-9091-43e5-a16d-7a287f2c5bb2\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") "
Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.263605 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/822af429-9091-43e5-a16d-7a287f2c5bb2-kube-api-access-m8g2n" (OuterVolumeSpecName: "kube-api-access-m8g2n") pod "822af429-9091-43e5-a16d-7a287f2c5bb2" (UID: "822af429-9091-43e5-a16d-7a287f2c5bb2"). InnerVolumeSpecName "kube-api-access-m8g2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.271867 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ceph" (OuterVolumeSpecName: "ceph") pod "822af429-9091-43e5-a16d-7a287f2c5bb2" (UID: "822af429-9091-43e5-a16d-7a287f2c5bb2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:48:43 crc kubenswrapper[4792]: E0301 09:48:43.296669 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory podName:822af429-9091-43e5-a16d-7a287f2c5bb2 nodeName:}" failed. No retries permitted until 2026-03-01 09:48:43.796641405 +0000 UTC m=+2453.038520602 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory") pod "822af429-9091-43e5-a16d-7a287f2c5bb2" (UID: "822af429-9091-43e5-a16d-7a287f2c5bb2") : error deleting /var/lib/kubelet/pods/822af429-9091-43e5-a16d-7a287f2c5bb2/volume-subpaths: remove /var/lib/kubelet/pods/822af429-9091-43e5-a16d-7a287f2c5bb2/volume-subpaths: no such file or directory
Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.299874 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "822af429-9091-43e5-a16d-7a287f2c5bb2" (UID: "822af429-9091-43e5-a16d-7a287f2c5bb2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.351962 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.351994 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8g2n\" (UniqueName: \"kubernetes.io/projected/822af429-9091-43e5-a16d-7a287f2c5bb2-kube-api-access-m8g2n\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.352003 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.798268 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" event={"ID":"822af429-9091-43e5-a16d-7a287f2c5bb2","Type":"ContainerDied","Data":"81b2f396188a63440fcba678466e372145f1b2a7a97016d2c98f9e382bb3bf96"} Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.798305 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81b2f396188a63440fcba678466e372145f1b2a7a97016d2c98f9e382bb3bf96" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.798721 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.861739 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory\") pod \"822af429-9091-43e5-a16d-7a287f2c5bb2\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.868106 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory" (OuterVolumeSpecName: "inventory") pod "822af429-9091-43e5-a16d-7a287f2c5bb2" (UID: "822af429-9091-43e5-a16d-7a287f2c5bb2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.906031 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m"] Mar 01 09:48:43 crc kubenswrapper[4792]: E0301 09:48:43.906398 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822af429-9091-43e5-a16d-7a287f2c5bb2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.906419 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="822af429-9091-43e5-a16d-7a287f2c5bb2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:48:43 crc kubenswrapper[4792]: E0301 09:48:43.906442 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e3f198d-a642-45b3-9a5a-fd5906670db8" containerName="oc" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.906449 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e3f198d-a642-45b3-9a5a-fd5906670db8" containerName="oc" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.906610 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1e3f198d-a642-45b3-9a5a-fd5906670db8" containerName="oc" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.906625 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="822af429-9091-43e5-a16d-7a287f2c5bb2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.907184 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.964334 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m"] Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.966177 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.068183 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.068447 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.068649 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mktvs\" (UniqueName: \"kubernetes.io/projected/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-kube-api-access-mktvs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.068727 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.170043 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.170148 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mktvs\" (UniqueName: \"kubernetes.io/projected/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-kube-api-access-mktvs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.170198 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-inventory\") pod 
\"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.170249 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.173224 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.173640 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.174556 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.191091 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mktvs\" (UniqueName: \"kubernetes.io/projected/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-kube-api-access-mktvs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.243419 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.790819 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m"] Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.811195 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" event={"ID":"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37","Type":"ContainerStarted","Data":"dc03f6c62bcf64863e46028ea3bb7dc4615b60cbd65d489e3c6a07cbcfb6540c"} Mar 01 09:48:45 crc kubenswrapper[4792]: I0301 09:48:45.821338 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" event={"ID":"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37","Type":"ContainerStarted","Data":"85c9f21baa52c98be995a258e8b085fb46979f0ec9568bd0eb472bbf230fefcc"} Mar 01 09:48:45 crc kubenswrapper[4792]: I0301 09:48:45.836107 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" podStartSLOduration=2.212627227 podStartE2EDuration="2.836089179s" podCreationTimestamp="2026-03-01 09:48:43 +0000 UTC" firstStartedPulling="2026-03-01 09:48:44.801619651 +0000 UTC m=+2454.043498858" lastFinishedPulling="2026-03-01 09:48:45.425081593 +0000 UTC m=+2454.666960810" observedRunningTime="2026-03-01 09:48:45.835693109 
+0000 UTC m=+2455.077572306" watchObservedRunningTime="2026-03-01 09:48:45.836089179 +0000 UTC m=+2455.077968376" Mar 01 09:48:49 crc kubenswrapper[4792]: I0301 09:48:49.853798 4792 generic.go:334] "Generic (PLEG): container finished" podID="2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" containerID="85c9f21baa52c98be995a258e8b085fb46979f0ec9568bd0eb472bbf230fefcc" exitCode=0 Mar 01 09:48:49 crc kubenswrapper[4792]: I0301 09:48:49.853883 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" event={"ID":"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37","Type":"ContainerDied","Data":"85c9f21baa52c98be995a258e8b085fb46979f0ec9568bd0eb472bbf230fefcc"} Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.265137 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.320230 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mktvs\" (UniqueName: \"kubernetes.io/projected/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-kube-api-access-mktvs\") pod \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.320299 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ceph\") pod \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.320368 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ssh-key-openstack-edpm-ipam\") pod \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " 
Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.320405 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-inventory\") pod \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.330545 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-kube-api-access-mktvs" (OuterVolumeSpecName: "kube-api-access-mktvs") pod "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" (UID: "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37"). InnerVolumeSpecName "kube-api-access-mktvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.330643 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ceph" (OuterVolumeSpecName: "ceph") pod "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" (UID: "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.353562 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" (UID: "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.355750 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-inventory" (OuterVolumeSpecName: "inventory") pod "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" (UID: "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.422487 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mktvs\" (UniqueName: \"kubernetes.io/projected/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-kube-api-access-mktvs\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.422846 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.422861 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.422878 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.872060 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" event={"ID":"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37","Type":"ContainerDied","Data":"dc03f6c62bcf64863e46028ea3bb7dc4615b60cbd65d489e3c6a07cbcfb6540c"} Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.872100 4792 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="dc03f6c62bcf64863e46028ea3bb7dc4615b60cbd65d489e3c6a07cbcfb6540c" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.872166 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.937487 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5"] Mar 01 09:48:51 crc kubenswrapper[4792]: E0301 09:48:51.937922 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.937942 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.938179 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.938888 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.941884 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.942076 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.942173 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.942227 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.943881 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.951723 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5"] Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.038359 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.038409 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdcf4\" (UniqueName: \"kubernetes.io/projected/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-kube-api-access-pdcf4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: 
\"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.038511 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.038577 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.140401 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.140478 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.140549 
4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.140577 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdcf4\" (UniqueName: \"kubernetes.io/projected/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-kube-api-access-pdcf4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.145298 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.146102 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.149809 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: 
\"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5"
Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.165287 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdcf4\" (UniqueName: \"kubernetes.io/projected/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-kube-api-access-pdcf4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5"
Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.258219 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5"
Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.764444 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5"]
Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.882981 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" event={"ID":"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0","Type":"ContainerStarted","Data":"5991a571e746a511019bcebff55e511b0b510410f4c7048f3a390aaf57e77022"}
Mar 01 09:48:53 crc kubenswrapper[4792]: I0301 09:48:53.906795 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" event={"ID":"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0","Type":"ContainerStarted","Data":"3b3b94d508283fee55d1620f5691acc50b6a3016f8452ef7d9860b67978a91f7"}
Mar 01 09:48:53 crc kubenswrapper[4792]: I0301 09:48:53.931425 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" podStartSLOduration=2.403768947 podStartE2EDuration="2.931409208s" podCreationTimestamp="2026-03-01 09:48:51 +0000 UTC" firstStartedPulling="2026-03-01 09:48:52.767132343 +0000 UTC m=+2462.009011540" lastFinishedPulling="2026-03-01 09:48:53.294772604 +0000 UTC m=+2462.536651801" observedRunningTime="2026-03-01 09:48:53.922893767 +0000 UTC m=+2463.164772964" watchObservedRunningTime="2026-03-01 09:48:53.931409208 +0000 UTC m=+2463.173288405"
Mar 01 09:49:04 crc kubenswrapper[4792]: I0301 09:49:04.943648 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:49:04 crc kubenswrapper[4792]: I0301 09:49:04.944196 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:49:34 crc kubenswrapper[4792]: I0301 09:49:34.942657 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 09:49:34 crc kubenswrapper[4792]: I0301 09:49:34.944259 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 09:49:34 crc kubenswrapper[4792]: I0301 09:49:34.944408 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv"
Mar 01 09:49:34 crc kubenswrapper[4792]: I0301 09:49:34.945143 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 01 09:49:34 crc kubenswrapper[4792]: I0301 09:49:34.945263 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" gracePeriod=600
Mar 01 09:49:35 crc kubenswrapper[4792]: E0301 09:49:35.082551 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 09:49:35 crc kubenswrapper[4792]: I0301 09:49:35.235766 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" exitCode=0
Mar 01 09:49:35 crc kubenswrapper[4792]: I0301 09:49:35.235808 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983"}
Mar 01 09:49:35 crc kubenswrapper[4792]: I0301 09:49:35.235839 4792 scope.go:117] "RemoveContainer" containerID="80f7f8ed75933ce29e4ba1caec37158448f83396fe1f5a8bba0233aec8df1ec7"
Mar 01 09:49:35 crc kubenswrapper[4792]: I0301 09:49:35.236437 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983"
Mar 01 09:49:35 crc kubenswrapper[4792]: E0301 09:49:35.236662 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 09:49:39 crc kubenswrapper[4792]: I0301 09:49:39.269442 4792 generic.go:334] "Generic (PLEG): container finished" podID="cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" containerID="3b3b94d508283fee55d1620f5691acc50b6a3016f8452ef7d9860b67978a91f7" exitCode=0
Mar 01 09:49:39 crc kubenswrapper[4792]: I0301 09:49:39.269519 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" event={"ID":"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0","Type":"ContainerDied","Data":"3b3b94d508283fee55d1620f5691acc50b6a3016f8452ef7d9860b67978a91f7"}
Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.615000 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5"
Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.774685 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdcf4\" (UniqueName: \"kubernetes.io/projected/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-kube-api-access-pdcf4\") pod \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") "
Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.774813 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-inventory\") pod \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") "
Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.774828 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ceph\") pod \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") "
Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.774845 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ssh-key-openstack-edpm-ipam\") pod \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") "
Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.780325 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-kube-api-access-pdcf4" (OuterVolumeSpecName: "kube-api-access-pdcf4") pod "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" (UID: "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0"). InnerVolumeSpecName "kube-api-access-pdcf4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.785075 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ceph" (OuterVolumeSpecName: "ceph") pod "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" (UID: "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.804439 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-inventory" (OuterVolumeSpecName: "inventory") pod "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" (UID: "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.812185 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" (UID: "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.877012 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdcf4\" (UniqueName: \"kubernetes.io/projected/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-kube-api-access-pdcf4\") on node \"crc\" DevicePath \"\""
Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.877047 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-inventory\") on node \"crc\" DevicePath \"\""
Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.877056 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ceph\") on node \"crc\" DevicePath \"\""
Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.877065 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.286490 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" event={"ID":"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0","Type":"ContainerDied","Data":"5991a571e746a511019bcebff55e511b0b510410f4c7048f3a390aaf57e77022"}
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.286748 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5991a571e746a511019bcebff55e511b0b510410f4c7048f3a390aaf57e77022"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.286555 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.391209 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8k5rj"]
Mar 01 09:49:41 crc kubenswrapper[4792]: E0301 09:49:41.391694 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.391716 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.391922 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.392660 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.397481 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.397721 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.398026 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.398159 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.398663 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.405446 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8k5rj"]
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.589621 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.589740 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ceph\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.589887 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.590002 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99vt8\" (UniqueName: \"kubernetes.io/projected/ac58ff00-ba74-492a-97f1-e72c56686f1d-kube-api-access-99vt8\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.691054 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ceph\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.691152 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.691213 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99vt8\" (UniqueName: \"kubernetes.io/projected/ac58ff00-ba74-492a-97f1-e72c56686f1d-kube-api-access-99vt8\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.691249 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.695239 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ceph\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.697106 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.700225 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.711662 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99vt8\" (UniqueName: \"kubernetes.io/projected/ac58ff00-ba74-492a-97f1-e72c56686f1d-kube-api-access-99vt8\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:42 crc kubenswrapper[4792]: I0301 09:49:42.007255 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:42 crc kubenswrapper[4792]: W0301 09:49:42.530638 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac58ff00_ba74_492a_97f1_e72c56686f1d.slice/crio-e76f2a47a32ae325d9f18bf7b6fb8564c6dbb5ff6d9d975c9bbd12ba32035f88 WatchSource:0}: Error finding container e76f2a47a32ae325d9f18bf7b6fb8564c6dbb5ff6d9d975c9bbd12ba32035f88: Status 404 returned error can't find the container with id e76f2a47a32ae325d9f18bf7b6fb8564c6dbb5ff6d9d975c9bbd12ba32035f88
Mar 01 09:49:42 crc kubenswrapper[4792]: I0301 09:49:42.539285 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8k5rj"]
Mar 01 09:49:43 crc kubenswrapper[4792]: I0301 09:49:43.301964 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" event={"ID":"ac58ff00-ba74-492a-97f1-e72c56686f1d","Type":"ContainerStarted","Data":"baae86ba8afecd3bca5e5d9015365aee4c985262e904999ecd39f2c9a6bda3b5"}
Mar 01 09:49:43 crc kubenswrapper[4792]: I0301 09:49:43.302404 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" event={"ID":"ac58ff00-ba74-492a-97f1-e72c56686f1d","Type":"ContainerStarted","Data":"e76f2a47a32ae325d9f18bf7b6fb8564c6dbb5ff6d9d975c9bbd12ba32035f88"}
Mar 01 09:49:43 crc kubenswrapper[4792]: I0301 09:49:43.327624 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" podStartSLOduration=1.886662719 podStartE2EDuration="2.327608231s" podCreationTimestamp="2026-03-01 09:49:41 +0000 UTC" firstStartedPulling="2026-03-01 09:49:42.532697844 +0000 UTC m=+2511.774577041" lastFinishedPulling="2026-03-01 09:49:42.973643356 +0000 UTC m=+2512.215522553" observedRunningTime="2026-03-01 09:49:43.323960181 +0000 UTC m=+2512.565839388" watchObservedRunningTime="2026-03-01 09:49:43.327608231 +0000 UTC m=+2512.569487428"
Mar 01 09:49:48 crc kubenswrapper[4792]: I0301 09:49:48.409088 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983"
Mar 01 09:49:48 crc kubenswrapper[4792]: E0301 09:49:48.410173 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 09:49:52 crc kubenswrapper[4792]: I0301 09:49:52.385701 4792 generic.go:334] "Generic (PLEG): container finished" podID="ac58ff00-ba74-492a-97f1-e72c56686f1d" containerID="baae86ba8afecd3bca5e5d9015365aee4c985262e904999ecd39f2c9a6bda3b5" exitCode=0
Mar 01 09:49:52 crc kubenswrapper[4792]: I0301 09:49:52.385799 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" event={"ID":"ac58ff00-ba74-492a-97f1-e72c56686f1d","Type":"ContainerDied","Data":"baae86ba8afecd3bca5e5d9015365aee4c985262e904999ecd39f2c9a6bda3b5"}
Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.763631 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.820000 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ssh-key-openstack-edpm-ipam\") pod \"ac58ff00-ba74-492a-97f1-e72c56686f1d\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") "
Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.820556 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-inventory-0\") pod \"ac58ff00-ba74-492a-97f1-e72c56686f1d\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") "
Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.820660 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ceph\") pod \"ac58ff00-ba74-492a-97f1-e72c56686f1d\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") "
Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.820799 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99vt8\" (UniqueName: \"kubernetes.io/projected/ac58ff00-ba74-492a-97f1-e72c56686f1d-kube-api-access-99vt8\") pod \"ac58ff00-ba74-492a-97f1-e72c56686f1d\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") "
Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.828138 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac58ff00-ba74-492a-97f1-e72c56686f1d-kube-api-access-99vt8" (OuterVolumeSpecName: "kube-api-access-99vt8") pod "ac58ff00-ba74-492a-97f1-e72c56686f1d" (UID: "ac58ff00-ba74-492a-97f1-e72c56686f1d"). InnerVolumeSpecName "kube-api-access-99vt8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.829086 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ceph" (OuterVolumeSpecName: "ceph") pod "ac58ff00-ba74-492a-97f1-e72c56686f1d" (UID: "ac58ff00-ba74-492a-97f1-e72c56686f1d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.844974 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "ac58ff00-ba74-492a-97f1-e72c56686f1d" (UID: "ac58ff00-ba74-492a-97f1-e72c56686f1d"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.846389 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ac58ff00-ba74-492a-97f1-e72c56686f1d" (UID: "ac58ff00-ba74-492a-97f1-e72c56686f1d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.922871 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.922954 4792 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-inventory-0\") on node \"crc\" DevicePath \"\""
Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.922968 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ceph\") on node \"crc\" DevicePath \"\""
Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.922978 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99vt8\" (UniqueName: \"kubernetes.io/projected/ac58ff00-ba74-492a-97f1-e72c56686f1d-kube-api-access-99vt8\") on node \"crc\" DevicePath \"\""
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.414720 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" event={"ID":"ac58ff00-ba74-492a-97f1-e72c56686f1d","Type":"ContainerDied","Data":"e76f2a47a32ae325d9f18bf7b6fb8564c6dbb5ff6d9d975c9bbd12ba32035f88"}
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.414760 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e76f2a47a32ae325d9f18bf7b6fb8564c6dbb5ff6d9d975c9bbd12ba32035f88"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.414884 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.493266 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"]
Mar 01 09:49:54 crc kubenswrapper[4792]: E0301 09:49:54.493681 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac58ff00-ba74-492a-97f1-e72c56686f1d" containerName="ssh-known-hosts-edpm-deployment"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.493703 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac58ff00-ba74-492a-97f1-e72c56686f1d" containerName="ssh-known-hosts-edpm-deployment"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.493883 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac58ff00-ba74-492a-97f1-e72c56686f1d" containerName="ssh-known-hosts-edpm-deployment"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.494601 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.499563 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.499779 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.500777 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.507864 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.508477 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"]
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.552066 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.552645 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.552746 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.552793 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.552821 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxwwx\" (UniqueName: \"kubernetes.io/projected/ff733b23-0a97-4623-9eeb-339aa02fc3b0-kube-api-access-cxwwx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.654112 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.654582 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxwwx\" (UniqueName: \"kubernetes.io/projected/ff733b23-0a97-4623-9eeb-339aa02fc3b0-kube-api-access-cxwwx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.654683 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.654746 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.661403 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.667440 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.670811 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.671403 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxwwx\" (UniqueName: \"kubernetes.io/projected/ff733b23-0a97-4623-9eeb-339aa02fc3b0-kube-api-access-cxwwx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"
Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.871179 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"
Mar 01 09:49:55 crc kubenswrapper[4792]: I0301 09:49:55.364226 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"]
Mar 01 09:49:55 crc kubenswrapper[4792]: I0301 09:49:55.421593 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" event={"ID":"ff733b23-0a97-4623-9eeb-339aa02fc3b0","Type":"ContainerStarted","Data":"db6cb5df12916c3876aa7d66ae1518f0b04d2d2d57926fff5a1fa1d1f6b4ca19"}
Mar 01 09:49:56 crc kubenswrapper[4792]: I0301 09:49:56.430207 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" event={"ID":"ff733b23-0a97-4623-9eeb-339aa02fc3b0","Type":"ContainerStarted","Data":"b3542fb65ba1732d969d1c1e12d706e23a7dab68ee9d6400fa7b59bdcc1e00eb"}
Mar 01 09:49:56 crc kubenswrapper[4792]: I0301 09:49:56.447469 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" podStartSLOduration=2.004931616 podStartE2EDuration="2.447448117s" podCreationTimestamp="2026-03-01 09:49:54 +0000 UTC" firstStartedPulling="2026-03-01 09:49:55.371842471 +0000 UTC m=+2524.613721668" lastFinishedPulling="2026-03-01 09:49:55.814358962 +0000 UTC m=+2525.056238169" observedRunningTime="2026-03-01 09:49:56.442857593 +0000 UTC m=+2525.684736810" watchObservedRunningTime="2026-03-01 09:49:56.447448117 +0000 UTC m=+2525.689327314"
Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.128741 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539310-cr6qh"]
Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.130348 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539310-cr6qh"
Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.132633 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.132797 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz"
Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.133191 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.145738 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539310-cr6qh"]
Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.152041 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-482sm\" (UniqueName: \"kubernetes.io/projected/bf147424-57ff-455c-9aac-e32adcab851e-kube-api-access-482sm\") pod \"auto-csr-approver-29539310-cr6qh\" (UID: \"bf147424-57ff-455c-9aac-e32adcab851e\") " pod="openshift-infra/auto-csr-approver-29539310-cr6qh"
Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.254195 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-482sm\" (UniqueName: \"kubernetes.io/projected/bf147424-57ff-455c-9aac-e32adcab851e-kube-api-access-482sm\") pod \"auto-csr-approver-29539310-cr6qh\" (UID: \"bf147424-57ff-455c-9aac-e32adcab851e\") "
pod="openshift-infra/auto-csr-approver-29539310-cr6qh" Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.275394 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-482sm\" (UniqueName: \"kubernetes.io/projected/bf147424-57ff-455c-9aac-e32adcab851e-kube-api-access-482sm\") pod \"auto-csr-approver-29539310-cr6qh\" (UID: \"bf147424-57ff-455c-9aac-e32adcab851e\") " pod="openshift-infra/auto-csr-approver-29539310-cr6qh" Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.457460 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539310-cr6qh" Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.862260 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539310-cr6qh"] Mar 01 09:50:01 crc kubenswrapper[4792]: I0301 09:50:01.482563 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539310-cr6qh" event={"ID":"bf147424-57ff-455c-9aac-e32adcab851e","Type":"ContainerStarted","Data":"d9fcfd8748da5c290f602e5b1a9738b8fa0a559810526842ef6d99baf1b2b366"} Mar 01 09:50:02 crc kubenswrapper[4792]: I0301 09:50:02.491678 4792 generic.go:334] "Generic (PLEG): container finished" podID="bf147424-57ff-455c-9aac-e32adcab851e" containerID="3be8237f11ee8a9c2a66a6dce0cfdb5c72c7e1c7d5445dc46a852faf899f2940" exitCode=0 Mar 01 09:50:02 crc kubenswrapper[4792]: I0301 09:50:02.491721 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539310-cr6qh" event={"ID":"bf147424-57ff-455c-9aac-e32adcab851e","Type":"ContainerDied","Data":"3be8237f11ee8a9c2a66a6dce0cfdb5c72c7e1c7d5445dc46a852faf899f2940"} Mar 01 09:50:03 crc kubenswrapper[4792]: I0301 09:50:03.408595 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:50:03 crc kubenswrapper[4792]: E0301 09:50:03.409198 4792 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:50:03 crc kubenswrapper[4792]: I0301 09:50:03.845576 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539310-cr6qh" Mar 01 09:50:03 crc kubenswrapper[4792]: I0301 09:50:03.924037 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-482sm\" (UniqueName: \"kubernetes.io/projected/bf147424-57ff-455c-9aac-e32adcab851e-kube-api-access-482sm\") pod \"bf147424-57ff-455c-9aac-e32adcab851e\" (UID: \"bf147424-57ff-455c-9aac-e32adcab851e\") " Mar 01 09:50:03 crc kubenswrapper[4792]: I0301 09:50:03.932800 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf147424-57ff-455c-9aac-e32adcab851e-kube-api-access-482sm" (OuterVolumeSpecName: "kube-api-access-482sm") pod "bf147424-57ff-455c-9aac-e32adcab851e" (UID: "bf147424-57ff-455c-9aac-e32adcab851e"). InnerVolumeSpecName "kube-api-access-482sm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.027010 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-482sm\" (UniqueName: \"kubernetes.io/projected/bf147424-57ff-455c-9aac-e32adcab851e-kube-api-access-482sm\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.512277 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539310-cr6qh" event={"ID":"bf147424-57ff-455c-9aac-e32adcab851e","Type":"ContainerDied","Data":"d9fcfd8748da5c290f602e5b1a9738b8fa0a559810526842ef6d99baf1b2b366"} Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.512325 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9fcfd8748da5c290f602e5b1a9738b8fa0a559810526842ef6d99baf1b2b366" Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.512322 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539310-cr6qh" Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.513864 4792 generic.go:334] "Generic (PLEG): container finished" podID="ff733b23-0a97-4623-9eeb-339aa02fc3b0" containerID="b3542fb65ba1732d969d1c1e12d706e23a7dab68ee9d6400fa7b59bdcc1e00eb" exitCode=0 Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.514014 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" event={"ID":"ff733b23-0a97-4623-9eeb-339aa02fc3b0","Type":"ContainerDied","Data":"b3542fb65ba1732d969d1c1e12d706e23a7dab68ee9d6400fa7b59bdcc1e00eb"} Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.918893 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539304-c2vn2"] Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.926279 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-infra/auto-csr-approver-29539304-c2vn2"] Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.419275 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e68f99-8c1f-4046-bb89-66516bff6370" path="/var/lib/kubelet/pods/97e68f99-8c1f-4046-bb89-66516bff6370/volumes" Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.919169 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.960767 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxwwx\" (UniqueName: \"kubernetes.io/projected/ff733b23-0a97-4623-9eeb-339aa02fc3b0-kube-api-access-cxwwx\") pod \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.961062 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ceph\") pod \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.961101 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-inventory\") pod \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.961188 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ssh-key-openstack-edpm-ipam\") pod \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.965890 
4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ceph" (OuterVolumeSpecName: "ceph") pod "ff733b23-0a97-4623-9eeb-339aa02fc3b0" (UID: "ff733b23-0a97-4623-9eeb-339aa02fc3b0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.966340 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff733b23-0a97-4623-9eeb-339aa02fc3b0-kube-api-access-cxwwx" (OuterVolumeSpecName: "kube-api-access-cxwwx") pod "ff733b23-0a97-4623-9eeb-339aa02fc3b0" (UID: "ff733b23-0a97-4623-9eeb-339aa02fc3b0"). InnerVolumeSpecName "kube-api-access-cxwwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.984678 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ff733b23-0a97-4623-9eeb-339aa02fc3b0" (UID: "ff733b23-0a97-4623-9eeb-339aa02fc3b0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.989635 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-inventory" (OuterVolumeSpecName: "inventory") pod "ff733b23-0a97-4623-9eeb-339aa02fc3b0" (UID: "ff733b23-0a97-4623-9eeb-339aa02fc3b0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.063736 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.063763 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.063774 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.063783 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxwwx\" (UniqueName: \"kubernetes.io/projected/ff733b23-0a97-4623-9eeb-339aa02fc3b0-kube-api-access-cxwwx\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.576661 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" event={"ID":"ff733b23-0a97-4623-9eeb-339aa02fc3b0","Type":"ContainerDied","Data":"db6cb5df12916c3876aa7d66ae1518f0b04d2d2d57926fff5a1fa1d1f6b4ca19"} Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.576946 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db6cb5df12916c3876aa7d66ae1518f0b04d2d2d57926fff5a1fa1d1f6b4ca19" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.577126 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.623060 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d"] Mar 01 09:50:06 crc kubenswrapper[4792]: E0301 09:50:06.623838 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf147424-57ff-455c-9aac-e32adcab851e" containerName="oc" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.623985 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf147424-57ff-455c-9aac-e32adcab851e" containerName="oc" Mar 01 09:50:06 crc kubenswrapper[4792]: E0301 09:50:06.624163 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff733b23-0a97-4623-9eeb-339aa02fc3b0" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.624266 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff733b23-0a97-4623-9eeb-339aa02fc3b0" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.624655 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff733b23-0a97-4623-9eeb-339aa02fc3b0" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.624798 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf147424-57ff-455c-9aac-e32adcab851e" containerName="oc" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.625790 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.632647 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.633722 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.633928 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.633991 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.634022 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.635235 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d"] Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.672817 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.673248 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27xkg\" (UniqueName: \"kubernetes.io/projected/34275228-a1ab-4955-9d16-d184643a86d1-kube-api-access-27xkg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.673330 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.673368 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.774921 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.775066 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27xkg\" (UniqueName: \"kubernetes.io/projected/34275228-a1ab-4955-9d16-d184643a86d1-kube-api-access-27xkg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.775580 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.775675 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.779218 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.779663 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.779991 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.792621 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27xkg\" (UniqueName: \"kubernetes.io/projected/34275228-a1ab-4955-9d16-d184643a86d1-kube-api-access-27xkg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.942180 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:07 crc kubenswrapper[4792]: I0301 09:50:07.424246 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d"] Mar 01 09:50:07 crc kubenswrapper[4792]: I0301 09:50:07.585554 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" event={"ID":"34275228-a1ab-4955-9d16-d184643a86d1","Type":"ContainerStarted","Data":"f1578421aff2f89fef2f6ea6ebd6c1d8cfc558aa833e2b5fc9ddfc26b93f7d1f"} Mar 01 09:50:08 crc kubenswrapper[4792]: I0301 09:50:08.597645 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" event={"ID":"34275228-a1ab-4955-9d16-d184643a86d1","Type":"ContainerStarted","Data":"aabeffd021257f62c9a2f4a842164b31c9cdc843be8d52afa279dac1f46235f3"} Mar 01 09:50:08 crc kubenswrapper[4792]: I0301 09:50:08.624820 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" podStartSLOduration=2.238836409 podStartE2EDuration="2.624790786s" podCreationTimestamp="2026-03-01 09:50:06 +0000 UTC" firstStartedPulling="2026-03-01 09:50:07.428278823 +0000 UTC m=+2536.670158030" 
lastFinishedPulling="2026-03-01 09:50:07.81423321 +0000 UTC m=+2537.056112407" observedRunningTime="2026-03-01 09:50:08.616190232 +0000 UTC m=+2537.858069469" watchObservedRunningTime="2026-03-01 09:50:08.624790786 +0000 UTC m=+2537.866670023" Mar 01 09:50:14 crc kubenswrapper[4792]: I0301 09:50:14.409191 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:50:14 crc kubenswrapper[4792]: E0301 09:50:14.410156 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:50:17 crc kubenswrapper[4792]: I0301 09:50:17.702313 4792 generic.go:334] "Generic (PLEG): container finished" podID="34275228-a1ab-4955-9d16-d184643a86d1" containerID="aabeffd021257f62c9a2f4a842164b31c9cdc843be8d52afa279dac1f46235f3" exitCode=0 Mar 01 09:50:17 crc kubenswrapper[4792]: I0301 09:50:17.702400 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" event={"ID":"34275228-a1ab-4955-9d16-d184643a86d1","Type":"ContainerDied","Data":"aabeffd021257f62c9a2f4a842164b31c9cdc843be8d52afa279dac1f46235f3"} Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.148425 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.316453 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ceph\") pod \"34275228-a1ab-4955-9d16-d184643a86d1\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.316645 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-inventory\") pod \"34275228-a1ab-4955-9d16-d184643a86d1\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.316689 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ssh-key-openstack-edpm-ipam\") pod \"34275228-a1ab-4955-9d16-d184643a86d1\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.316832 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27xkg\" (UniqueName: \"kubernetes.io/projected/34275228-a1ab-4955-9d16-d184643a86d1-kube-api-access-27xkg\") pod \"34275228-a1ab-4955-9d16-d184643a86d1\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.322409 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34275228-a1ab-4955-9d16-d184643a86d1-kube-api-access-27xkg" (OuterVolumeSpecName: "kube-api-access-27xkg") pod "34275228-a1ab-4955-9d16-d184643a86d1" (UID: "34275228-a1ab-4955-9d16-d184643a86d1"). InnerVolumeSpecName "kube-api-access-27xkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.323034 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ceph" (OuterVolumeSpecName: "ceph") pod "34275228-a1ab-4955-9d16-d184643a86d1" (UID: "34275228-a1ab-4955-9d16-d184643a86d1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.341272 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-inventory" (OuterVolumeSpecName: "inventory") pod "34275228-a1ab-4955-9d16-d184643a86d1" (UID: "34275228-a1ab-4955-9d16-d184643a86d1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.344049 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "34275228-a1ab-4955-9d16-d184643a86d1" (UID: "34275228-a1ab-4955-9d16-d184643a86d1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.419470 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27xkg\" (UniqueName: \"kubernetes.io/projected/34275228-a1ab-4955-9d16-d184643a86d1-kube-api-access-27xkg\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.419504 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.419519 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.419531 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.755520 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.755559 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" event={"ID":"34275228-a1ab-4955-9d16-d184643a86d1","Type":"ContainerDied","Data":"f1578421aff2f89fef2f6ea6ebd6c1d8cfc558aa833e2b5fc9ddfc26b93f7d1f"} Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.755898 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1578421aff2f89fef2f6ea6ebd6c1d8cfc558aa833e2b5fc9ddfc26b93f7d1f" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.818245 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh"] Mar 01 09:50:19 crc kubenswrapper[4792]: E0301 09:50:19.818841 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34275228-a1ab-4955-9d16-d184643a86d1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.818938 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="34275228-a1ab-4955-9d16-d184643a86d1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.819229 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="34275228-a1ab-4955-9d16-d184643a86d1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.820226 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.825176 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.825402 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.825504 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.825215 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.825269 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.825359 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.825364 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.830496 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.833107 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh"] Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.850942 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg97l\" (UniqueName: 
\"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-kube-api-access-sg97l\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.850993 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851073 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851099 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851135 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851155 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851189 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851213 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851240 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851284 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851312 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851339 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.952450 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.952506 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953174 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ceph\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953257 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953298 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953323 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953342 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953371 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953400 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953430 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953450 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg97l\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-kube-api-access-sg97l\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953470 
4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.958595 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.959398 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.960305 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.960430 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.960434 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.962103 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.965303 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.965444 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.971519 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.974414 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg97l\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-kube-api-access-sg97l\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.975055 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.975288 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.976496 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:20 crc kubenswrapper[4792]: I0301 09:50:20.134479 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:20 crc kubenswrapper[4792]: I0301 09:50:20.629133 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh"] Mar 01 09:50:20 crc kubenswrapper[4792]: I0301 09:50:20.764334 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" event={"ID":"d11c64e6-0562-41d9-a213-f1c5749b4c83","Type":"ContainerStarted","Data":"e19e492b8b4c3b1acedd50eee438177fb887e813c2336c4b49ea100012ecdfaf"} Mar 01 09:50:21 crc kubenswrapper[4792]: I0301 09:50:21.773092 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" event={"ID":"d11c64e6-0562-41d9-a213-f1c5749b4c83","Type":"ContainerStarted","Data":"662eb199da8ca1b1a64e3ef37c17726d184337944bb8aabe6168f79569cae95a"} Mar 01 09:50:21 crc kubenswrapper[4792]: I0301 09:50:21.797381 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" podStartSLOduration=2.384096943 podStartE2EDuration="2.797361379s" podCreationTimestamp="2026-03-01 09:50:19 +0000 UTC" firstStartedPulling="2026-03-01 09:50:20.639484672 +0000 
UTC m=+2549.881363869" lastFinishedPulling="2026-03-01 09:50:21.052749118 +0000 UTC m=+2550.294628305" observedRunningTime="2026-03-01 09:50:21.787203847 +0000 UTC m=+2551.029083044" watchObservedRunningTime="2026-03-01 09:50:21.797361379 +0000 UTC m=+2551.039240566" Mar 01 09:50:28 crc kubenswrapper[4792]: I0301 09:50:28.408644 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:50:28 crc kubenswrapper[4792]: E0301 09:50:28.409308 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:50:31 crc kubenswrapper[4792]: I0301 09:50:31.220736 4792 scope.go:117] "RemoveContainer" containerID="3cbaa243041e250919798684d495339949ab384e80a45c460f4d0e0c2cfab407" Mar 01 09:50:41 crc kubenswrapper[4792]: I0301 09:50:41.417955 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:50:41 crc kubenswrapper[4792]: E0301 09:50:41.418636 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:50:52 crc kubenswrapper[4792]: I0301 09:50:52.998892 4792 generic.go:334] "Generic (PLEG): container finished" podID="d11c64e6-0562-41d9-a213-f1c5749b4c83" 
containerID="662eb199da8ca1b1a64e3ef37c17726d184337944bb8aabe6168f79569cae95a" exitCode=0 Mar 01 09:50:53 crc kubenswrapper[4792]: I0301 09:50:52.999000 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" event={"ID":"d11c64e6-0562-41d9-a213-f1c5749b4c83","Type":"ContainerDied","Data":"662eb199da8ca1b1a64e3ef37c17726d184337944bb8aabe6168f79569cae95a"} Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.467738 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.477210 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-nova-combined-ca-bundle\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.482150 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.578744 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ssh-key-openstack-edpm-ipam\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.578881 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.578906 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ceph\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.578949 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-libvirt-combined-ca-bundle\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.578971 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg97l\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-kube-api-access-sg97l\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.579000 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.579158 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-ovn-default-certs-0\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.579186 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-inventory\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.579517 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-neutron-metadata-combined-ca-bundle\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.579578 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-bootstrap-combined-ca-bundle\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.579627 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ovn-combined-ca-bundle\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.579674 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-repo-setup-combined-ca-bundle\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.580115 4792 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.582458 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.583203 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.584079 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-kube-api-access-sg97l" (OuterVolumeSpecName: "kube-api-access-sg97l") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "kube-api-access-sg97l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.584546 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.585592 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.586543 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ceph" (OuterVolumeSpecName: "ceph") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.587197 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.589504 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.601577 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.602980 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.608996 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.618100 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-inventory" (OuterVolumeSpecName: "inventory") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.681750 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.681970 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682072 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682135 4792 reconciler_common.go:293] "Volume 
detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682191 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg97l\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-kube-api-access-sg97l\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682245 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682305 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682360 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682417 4792 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682580 4792 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 
09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682731 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682798 4792 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.016359 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" event={"ID":"d11c64e6-0562-41d9-a213-f1c5749b4c83","Type":"ContainerDied","Data":"e19e492b8b4c3b1acedd50eee438177fb887e813c2336c4b49ea100012ecdfaf"} Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.016399 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e19e492b8b4c3b1acedd50eee438177fb887e813c2336c4b49ea100012ecdfaf" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.016402 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.205265 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj"] Mar 01 09:50:55 crc kubenswrapper[4792]: E0301 09:50:55.205861 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11c64e6-0562-41d9-a213-f1c5749b4c83" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.205883 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11c64e6-0562-41d9-a213-f1c5749b4c83" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.206067 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11c64e6-0562-41d9-a213-f1c5749b4c83" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.206609 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.210749 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.211434 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.211684 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.212993 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.213311 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.236042 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj"] Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.291747 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.291869 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: 
\"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.291927 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.291972 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjgs4\" (UniqueName: \"kubernetes.io/projected/f3a428e9-b35d-4f80-bb40-c158095d5bfa-kube-api-access-sjgs4\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.393151 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.393256 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.393288 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.393330 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjgs4\" (UniqueName: \"kubernetes.io/projected/f3a428e9-b35d-4f80-bb40-c158095d5bfa-kube-api-access-sjgs4\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.397989 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.397996 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.408034 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: 
\"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.409153 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:50:55 crc kubenswrapper[4792]: E0301 09:50:55.409417 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.410707 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjgs4\" (UniqueName: \"kubernetes.io/projected/f3a428e9-b35d-4f80-bb40-c158095d5bfa-kube-api-access-sjgs4\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.520632 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:56 crc kubenswrapper[4792]: W0301 09:50:56.024165 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3a428e9_b35d_4f80_bb40_c158095d5bfa.slice/crio-35fac49500e6f43757e1ca0a464c6df2c17291c1695abd5134f47b05e6e3347a WatchSource:0}: Error finding container 35fac49500e6f43757e1ca0a464c6df2c17291c1695abd5134f47b05e6e3347a: Status 404 returned error can't find the container with id 35fac49500e6f43757e1ca0a464c6df2c17291c1695abd5134f47b05e6e3347a Mar 01 09:50:56 crc kubenswrapper[4792]: I0301 09:50:56.024607 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj"] Mar 01 09:50:56 crc kubenswrapper[4792]: I0301 09:50:56.027177 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:50:57 crc kubenswrapper[4792]: I0301 09:50:57.032661 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" event={"ID":"f3a428e9-b35d-4f80-bb40-c158095d5bfa","Type":"ContainerStarted","Data":"f56e4a269d477fb54657636d48bf2e78c700a87055d56a6f2355bd8762f25fe7"} Mar 01 09:50:57 crc kubenswrapper[4792]: I0301 09:50:57.035146 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" event={"ID":"f3a428e9-b35d-4f80-bb40-c158095d5bfa","Type":"ContainerStarted","Data":"35fac49500e6f43757e1ca0a464c6df2c17291c1695abd5134f47b05e6e3347a"} Mar 01 09:50:58 crc kubenswrapper[4792]: I0301 09:50:58.063083 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" podStartSLOduration=2.313029307 podStartE2EDuration="3.063055782s" podCreationTimestamp="2026-03-01 09:50:55 +0000 UTC" 
firstStartedPulling="2026-03-01 09:50:56.026947773 +0000 UTC m=+2585.268826970" lastFinishedPulling="2026-03-01 09:50:56.776974248 +0000 UTC m=+2586.018853445" observedRunningTime="2026-03-01 09:50:58.059108684 +0000 UTC m=+2587.300987881" watchObservedRunningTime="2026-03-01 09:50:58.063055782 +0000 UTC m=+2587.304934979" Mar 01 09:51:03 crc kubenswrapper[4792]: I0301 09:51:03.087355 4792 generic.go:334] "Generic (PLEG): container finished" podID="f3a428e9-b35d-4f80-bb40-c158095d5bfa" containerID="f56e4a269d477fb54657636d48bf2e78c700a87055d56a6f2355bd8762f25fe7" exitCode=0 Mar 01 09:51:03 crc kubenswrapper[4792]: I0301 09:51:03.087388 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" event={"ID":"f3a428e9-b35d-4f80-bb40-c158095d5bfa","Type":"ContainerDied","Data":"f56e4a269d477fb54657636d48bf2e78c700a87055d56a6f2355bd8762f25fe7"} Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.477165 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.520945 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ceph\") pod \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.521156 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ssh-key-openstack-edpm-ipam\") pod \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.521216 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjgs4\" (UniqueName: \"kubernetes.io/projected/f3a428e9-b35d-4f80-bb40-c158095d5bfa-kube-api-access-sjgs4\") pod \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.521263 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-inventory\") pod \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.526275 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a428e9-b35d-4f80-bb40-c158095d5bfa-kube-api-access-sjgs4" (OuterVolumeSpecName: "kube-api-access-sjgs4") pod "f3a428e9-b35d-4f80-bb40-c158095d5bfa" (UID: "f3a428e9-b35d-4f80-bb40-c158095d5bfa"). InnerVolumeSpecName "kube-api-access-sjgs4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.526336 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ceph" (OuterVolumeSpecName: "ceph") pod "f3a428e9-b35d-4f80-bb40-c158095d5bfa" (UID: "f3a428e9-b35d-4f80-bb40-c158095d5bfa"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.545418 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f3a428e9-b35d-4f80-bb40-c158095d5bfa" (UID: "f3a428e9-b35d-4f80-bb40-c158095d5bfa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.546101 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-inventory" (OuterVolumeSpecName: "inventory") pod "f3a428e9-b35d-4f80-bb40-c158095d5bfa" (UID: "f3a428e9-b35d-4f80-bb40-c158095d5bfa"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.623425 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.623463 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.623474 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjgs4\" (UniqueName: \"kubernetes.io/projected/f3a428e9-b35d-4f80-bb40-c158095d5bfa-kube-api-access-sjgs4\") on node \"crc\" DevicePath \"\"" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.623484 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.132231 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" event={"ID":"f3a428e9-b35d-4f80-bb40-c158095d5bfa","Type":"ContainerDied","Data":"35fac49500e6f43757e1ca0a464c6df2c17291c1695abd5134f47b05e6e3347a"} Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.132298 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35fac49500e6f43757e1ca0a464c6df2c17291c1695abd5134f47b05e6e3347a" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.132400 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.231383 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl"] Mar 01 09:51:05 crc kubenswrapper[4792]: E0301 09:51:05.231877 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a428e9-b35d-4f80-bb40-c158095d5bfa" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.231986 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a428e9-b35d-4f80-bb40-c158095d5bfa" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.232309 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a428e9-b35d-4f80-bb40-c158095d5bfa" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.233018 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.237589 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.237632 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.238154 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.238239 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.239245 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.240572 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.242522 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl"] Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.336161 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.336467 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt4tm\" (UniqueName: 
\"kubernetes.io/projected/e4b8a64b-6bea-426c-b1f5-2372342d4211-kube-api-access-mt4tm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.336640 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.336753 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.337355 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.337533 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.439327 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.439411 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt4tm\" (UniqueName: \"kubernetes.io/projected/e4b8a64b-6bea-426c-b1f5-2372342d4211-kube-api-access-mt4tm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.439458 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.439486 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.439530 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.439594 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.440565 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.447560 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.448246 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.449338 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.450468 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.463445 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt4tm\" (UniqueName: \"kubernetes.io/projected/e4b8a64b-6bea-426c-b1f5-2372342d4211-kube-api-access-mt4tm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.556387 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:06 crc kubenswrapper[4792]: I0301 09:51:06.062260 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl"] Mar 01 09:51:06 crc kubenswrapper[4792]: I0301 09:51:06.140944 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" event={"ID":"e4b8a64b-6bea-426c-b1f5-2372342d4211","Type":"ContainerStarted","Data":"c86975f99bba1488fcf86272ae150d1c14d78a7ddfa1c7f9f1e74f6697ff114d"} Mar 01 09:51:07 crc kubenswrapper[4792]: I0301 09:51:07.153552 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" event={"ID":"e4b8a64b-6bea-426c-b1f5-2372342d4211","Type":"ContainerStarted","Data":"23442558607cc4d8bcc03bb7871d1f97928a51b594e8f3c0b6969fd7fedbf63d"} Mar 01 09:51:10 crc kubenswrapper[4792]: I0301 09:51:10.409110 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:51:10 crc kubenswrapper[4792]: E0301 09:51:10.409840 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:51:21 crc kubenswrapper[4792]: I0301 09:51:21.414728 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:51:21 crc kubenswrapper[4792]: E0301 09:51:21.415598 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:51:35 crc kubenswrapper[4792]: I0301 09:51:35.409038 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:51:35 crc kubenswrapper[4792]: E0301 09:51:35.410001 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:51:46 crc kubenswrapper[4792]: I0301 09:51:46.544775 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:51:46 crc kubenswrapper[4792]: E0301 09:51:46.569684 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.154861 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" podStartSLOduration=54.776781499 podStartE2EDuration="55.154843522s" podCreationTimestamp="2026-03-01 09:51:05 +0000 UTC" firstStartedPulling="2026-03-01 09:51:06.064769059 +0000 
UTC m=+2595.306648256" lastFinishedPulling="2026-03-01 09:51:06.442831082 +0000 UTC m=+2595.684710279" observedRunningTime="2026-03-01 09:51:07.175147328 +0000 UTC m=+2596.417026525" watchObservedRunningTime="2026-03-01 09:52:00.154843522 +0000 UTC m=+2649.396722719" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.158002 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539312-tpqkw"] Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.160551 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539312-tpqkw" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.185115 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.185363 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.185895 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.217151 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539312-tpqkw"] Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.242360 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn774\" (UniqueName: \"kubernetes.io/projected/11abc020-6c8a-4de3-8afc-229196293ab0-kube-api-access-mn774\") pod \"auto-csr-approver-29539312-tpqkw\" (UID: \"11abc020-6c8a-4de3-8afc-229196293ab0\") " pod="openshift-infra/auto-csr-approver-29539312-tpqkw" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.343949 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn774\" (UniqueName: 
\"kubernetes.io/projected/11abc020-6c8a-4de3-8afc-229196293ab0-kube-api-access-mn774\") pod \"auto-csr-approver-29539312-tpqkw\" (UID: \"11abc020-6c8a-4de3-8afc-229196293ab0\") " pod="openshift-infra/auto-csr-approver-29539312-tpqkw" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.372181 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn774\" (UniqueName: \"kubernetes.io/projected/11abc020-6c8a-4de3-8afc-229196293ab0-kube-api-access-mn774\") pod \"auto-csr-approver-29539312-tpqkw\" (UID: \"11abc020-6c8a-4de3-8afc-229196293ab0\") " pod="openshift-infra/auto-csr-approver-29539312-tpqkw" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.500706 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539312-tpqkw" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.969679 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539312-tpqkw"] Mar 01 09:52:01 crc kubenswrapper[4792]: I0301 09:52:01.418570 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:52:01 crc kubenswrapper[4792]: E0301 09:52:01.419054 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:52:01 crc kubenswrapper[4792]: I0301 09:52:01.711788 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539312-tpqkw" 
event={"ID":"11abc020-6c8a-4de3-8afc-229196293ab0","Type":"ContainerStarted","Data":"4fba4a423da85678a96691dddd15b5d9d730ca82d3b8aec89a60069a039f8a2c"} Mar 01 09:52:02 crc kubenswrapper[4792]: I0301 09:52:02.722999 4792 generic.go:334] "Generic (PLEG): container finished" podID="11abc020-6c8a-4de3-8afc-229196293ab0" containerID="89836f5e70f069d0d23a66a3f24b77f6002210b440a744a89543043e75793243" exitCode=0 Mar 01 09:52:02 crc kubenswrapper[4792]: I0301 09:52:02.723045 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539312-tpqkw" event={"ID":"11abc020-6c8a-4de3-8afc-229196293ab0","Type":"ContainerDied","Data":"89836f5e70f069d0d23a66a3f24b77f6002210b440a744a89543043e75793243"} Mar 01 09:52:04 crc kubenswrapper[4792]: I0301 09:52:04.007054 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539312-tpqkw" Mar 01 09:52:04 crc kubenswrapper[4792]: I0301 09:52:04.208460 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn774\" (UniqueName: \"kubernetes.io/projected/11abc020-6c8a-4de3-8afc-229196293ab0-kube-api-access-mn774\") pod \"11abc020-6c8a-4de3-8afc-229196293ab0\" (UID: \"11abc020-6c8a-4de3-8afc-229196293ab0\") " Mar 01 09:52:04 crc kubenswrapper[4792]: I0301 09:52:04.214192 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11abc020-6c8a-4de3-8afc-229196293ab0-kube-api-access-mn774" (OuterVolumeSpecName: "kube-api-access-mn774") pod "11abc020-6c8a-4de3-8afc-229196293ab0" (UID: "11abc020-6c8a-4de3-8afc-229196293ab0"). InnerVolumeSpecName "kube-api-access-mn774". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:52:04 crc kubenswrapper[4792]: I0301 09:52:04.311659 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn774\" (UniqueName: \"kubernetes.io/projected/11abc020-6c8a-4de3-8afc-229196293ab0-kube-api-access-mn774\") on node \"crc\" DevicePath \"\"" Mar 01 09:52:04 crc kubenswrapper[4792]: I0301 09:52:04.740627 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539312-tpqkw" event={"ID":"11abc020-6c8a-4de3-8afc-229196293ab0","Type":"ContainerDied","Data":"4fba4a423da85678a96691dddd15b5d9d730ca82d3b8aec89a60069a039f8a2c"} Mar 01 09:52:04 crc kubenswrapper[4792]: I0301 09:52:04.740670 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fba4a423da85678a96691dddd15b5d9d730ca82d3b8aec89a60069a039f8a2c" Mar 01 09:52:04 crc kubenswrapper[4792]: I0301 09:52:04.740720 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539312-tpqkw" Mar 01 09:52:05 crc kubenswrapper[4792]: I0301 09:52:05.072985 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539306-tt9nk"] Mar 01 09:52:05 crc kubenswrapper[4792]: I0301 09:52:05.080587 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539306-tt9nk"] Mar 01 09:52:05 crc kubenswrapper[4792]: I0301 09:52:05.420559 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5d29db8-7573-4364-9e18-20658b790d1f" path="/var/lib/kubelet/pods/f5d29db8-7573-4364-9e18-20658b790d1f/volumes" Mar 01 09:52:13 crc kubenswrapper[4792]: I0301 09:52:13.409877 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:52:13 crc kubenswrapper[4792]: E0301 09:52:13.410751 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:52:21 crc kubenswrapper[4792]: I0301 09:52:21.879445 4792 generic.go:334] "Generic (PLEG): container finished" podID="e4b8a64b-6bea-426c-b1f5-2372342d4211" containerID="23442558607cc4d8bcc03bb7871d1f97928a51b594e8f3c0b6969fd7fedbf63d" exitCode=0 Mar 01 09:52:21 crc kubenswrapper[4792]: I0301 09:52:21.879588 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" event={"ID":"e4b8a64b-6bea-426c-b1f5-2372342d4211","Type":"ContainerDied","Data":"23442558607cc4d8bcc03bb7871d1f97928a51b594e8f3c0b6969fd7fedbf63d"} Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.275714 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.473254 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ceph\") pod \"e4b8a64b-6bea-426c-b1f5-2372342d4211\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.473315 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-inventory\") pod \"e4b8a64b-6bea-426c-b1f5-2372342d4211\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.474279 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ssh-key-openstack-edpm-ipam\") pod \"e4b8a64b-6bea-426c-b1f5-2372342d4211\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.474322 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt4tm\" (UniqueName: \"kubernetes.io/projected/e4b8a64b-6bea-426c-b1f5-2372342d4211-kube-api-access-mt4tm\") pod \"e4b8a64b-6bea-426c-b1f5-2372342d4211\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.474382 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovncontroller-config-0\") pod \"e4b8a64b-6bea-426c-b1f5-2372342d4211\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.474415 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovn-combined-ca-bundle\") pod \"e4b8a64b-6bea-426c-b1f5-2372342d4211\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.479499 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4b8a64b-6bea-426c-b1f5-2372342d4211-kube-api-access-mt4tm" (OuterVolumeSpecName: "kube-api-access-mt4tm") pod "e4b8a64b-6bea-426c-b1f5-2372342d4211" (UID: "e4b8a64b-6bea-426c-b1f5-2372342d4211"). InnerVolumeSpecName "kube-api-access-mt4tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.480453 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ceph" (OuterVolumeSpecName: "ceph") pod "e4b8a64b-6bea-426c-b1f5-2372342d4211" (UID: "e4b8a64b-6bea-426c-b1f5-2372342d4211"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.480897 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e4b8a64b-6bea-426c-b1f5-2372342d4211" (UID: "e4b8a64b-6bea-426c-b1f5-2372342d4211"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.498864 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "e4b8a64b-6bea-426c-b1f5-2372342d4211" (UID: "e4b8a64b-6bea-426c-b1f5-2372342d4211"). 
InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.499478 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-inventory" (OuterVolumeSpecName: "inventory") pod "e4b8a64b-6bea-426c-b1f5-2372342d4211" (UID: "e4b8a64b-6bea-426c-b1f5-2372342d4211"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.507687 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e4b8a64b-6bea-426c-b1f5-2372342d4211" (UID: "e4b8a64b-6bea-426c-b1f5-2372342d4211"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.576562 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.576601 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.576613 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.576623 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt4tm\" (UniqueName: 
\"kubernetes.io/projected/e4b8a64b-6bea-426c-b1f5-2372342d4211-kube-api-access-mt4tm\") on node \"crc\" DevicePath \"\"" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.576632 4792 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.576640 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.895837 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" event={"ID":"e4b8a64b-6bea-426c-b1f5-2372342d4211","Type":"ContainerDied","Data":"c86975f99bba1488fcf86272ae150d1c14d78a7ddfa1c7f9f1e74f6697ff114d"} Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.895882 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c86975f99bba1488fcf86272ae150d1c14d78a7ddfa1c7f9f1e74f6697ff114d" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.895900 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.014214 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt"] Mar 01 09:52:24 crc kubenswrapper[4792]: E0301 09:52:24.014571 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11abc020-6c8a-4de3-8afc-229196293ab0" containerName="oc" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.014586 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="11abc020-6c8a-4de3-8afc-229196293ab0" containerName="oc" Mar 01 09:52:24 crc kubenswrapper[4792]: E0301 09:52:24.014600 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4b8a64b-6bea-426c-b1f5-2372342d4211" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.014608 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b8a64b-6bea-426c-b1f5-2372342d4211" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.014779 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="11abc020-6c8a-4de3-8afc-229196293ab0" containerName="oc" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.014800 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4b8a64b-6bea-426c-b1f5-2372342d4211" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.015425 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.017748 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.017900 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.018009 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.018575 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.018834 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.019166 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.019407 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.033840 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt"] Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.186344 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc 
kubenswrapper[4792]: I0301 09:52:24.186413 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.186453 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.186481 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.187038 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr4f2\" (UniqueName: \"kubernetes.io/projected/f737af00-5e6f-4a95-bf94-738b72990ebd-kube-api-access-cr4f2\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 
09:52:24.187087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.187181 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.288869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.289414 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.289479 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.289526 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.289556 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.289594 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr4f2\" (UniqueName: \"kubernetes.io/projected/f737af00-5e6f-4a95-bf94-738b72990ebd-kube-api-access-cr4f2\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.289629 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.292804 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.293145 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.294563 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.295678 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") 
" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.298492 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.300116 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.310789 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr4f2\" (UniqueName: \"kubernetes.io/projected/f737af00-5e6f-4a95-bf94-738b72990ebd-kube-api-access-cr4f2\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.364522 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.684838 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt"] Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.919327 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" event={"ID":"f737af00-5e6f-4a95-bf94-738b72990ebd","Type":"ContainerStarted","Data":"87d887549298c09ffe2d57352b333fabf0848f6fb2f69c640fc971765e3e3259"} Mar 01 09:52:25 crc kubenswrapper[4792]: I0301 09:52:25.928120 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" event={"ID":"f737af00-5e6f-4a95-bf94-738b72990ebd","Type":"ContainerStarted","Data":"b89dd5932c826da6e3e27b6cf0817bc475f510c4beb76bbcb78cfad314ff427c"} Mar 01 09:52:25 crc kubenswrapper[4792]: I0301 09:52:25.955019 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" podStartSLOduration=2.521233301 podStartE2EDuration="2.955001046s" podCreationTimestamp="2026-03-01 09:52:23 +0000 UTC" firstStartedPulling="2026-03-01 09:52:24.691662615 +0000 UTC m=+2673.933541812" lastFinishedPulling="2026-03-01 09:52:25.12543036 +0000 UTC m=+2674.367309557" observedRunningTime="2026-03-01 09:52:25.949346956 +0000 UTC m=+2675.191226153" watchObservedRunningTime="2026-03-01 09:52:25.955001046 +0000 UTC m=+2675.196880243" Mar 01 09:52:27 crc kubenswrapper[4792]: I0301 09:52:27.408946 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:52:27 crc kubenswrapper[4792]: E0301 09:52:27.409433 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:52:31 crc kubenswrapper[4792]: I0301 09:52:31.315796 4792 scope.go:117] "RemoveContainer" containerID="739afaccd13faa05c6c15e2c6b70ac689c35aa13a309f0b869c97a20dddff65e" Mar 01 09:52:40 crc kubenswrapper[4792]: I0301 09:52:40.409299 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:52:40 crc kubenswrapper[4792]: E0301 09:52:40.410245 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:52:53 crc kubenswrapper[4792]: I0301 09:52:53.409996 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:52:53 crc kubenswrapper[4792]: E0301 09:52:53.411048 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:53:06 crc kubenswrapper[4792]: I0301 09:53:06.408817 4792 scope.go:117] "RemoveContainer" 
containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:53:06 crc kubenswrapper[4792]: E0301 09:53:06.409464 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:53:19 crc kubenswrapper[4792]: I0301 09:53:19.409712 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:53:19 crc kubenswrapper[4792]: E0301 09:53:19.410568 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:53:26 crc kubenswrapper[4792]: I0301 09:53:26.421058 4792 generic.go:334] "Generic (PLEG): container finished" podID="f737af00-5e6f-4a95-bf94-738b72990ebd" containerID="b89dd5932c826da6e3e27b6cf0817bc475f510c4beb76bbcb78cfad314ff427c" exitCode=0 Mar 01 09:53:26 crc kubenswrapper[4792]: I0301 09:53:26.421332 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" event={"ID":"f737af00-5e6f-4a95-bf94-738b72990ebd","Type":"ContainerDied","Data":"b89dd5932c826da6e3e27b6cf0817bc475f510c4beb76bbcb78cfad314ff427c"} Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.809832 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.849257 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"f737af00-5e6f-4a95-bf94-738b72990ebd\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.849312 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-inventory\") pod \"f737af00-5e6f-4a95-bf94-738b72990ebd\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.849337 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr4f2\" (UniqueName: \"kubernetes.io/projected/f737af00-5e6f-4a95-bf94-738b72990ebd-kube-api-access-cr4f2\") pod \"f737af00-5e6f-4a95-bf94-738b72990ebd\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.849397 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ceph\") pod \"f737af00-5e6f-4a95-bf94-738b72990ebd\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.850242 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ssh-key-openstack-edpm-ipam\") pod \"f737af00-5e6f-4a95-bf94-738b72990ebd\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.850457 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-nova-metadata-neutron-config-0\") pod \"f737af00-5e6f-4a95-bf94-738b72990ebd\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.850513 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-metadata-combined-ca-bundle\") pod \"f737af00-5e6f-4a95-bf94-738b72990ebd\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.858952 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f737af00-5e6f-4a95-bf94-738b72990ebd-kube-api-access-cr4f2" (OuterVolumeSpecName: "kube-api-access-cr4f2") pod "f737af00-5e6f-4a95-bf94-738b72990ebd" (UID: "f737af00-5e6f-4a95-bf94-738b72990ebd"). InnerVolumeSpecName "kube-api-access-cr4f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.859106 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f737af00-5e6f-4a95-bf94-738b72990ebd" (UID: "f737af00-5e6f-4a95-bf94-738b72990ebd"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.859155 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ceph" (OuterVolumeSpecName: "ceph") pod "f737af00-5e6f-4a95-bf94-738b72990ebd" (UID: "f737af00-5e6f-4a95-bf94-738b72990ebd"). 
InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.884041 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "f737af00-5e6f-4a95-bf94-738b72990ebd" (UID: "f737af00-5e6f-4a95-bf94-738b72990ebd"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.886335 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-inventory" (OuterVolumeSpecName: "inventory") pod "f737af00-5e6f-4a95-bf94-738b72990ebd" (UID: "f737af00-5e6f-4a95-bf94-738b72990ebd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.889765 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "f737af00-5e6f-4a95-bf94-738b72990ebd" (UID: "f737af00-5e6f-4a95-bf94-738b72990ebd"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.889829 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f737af00-5e6f-4a95-bf94-738b72990ebd" (UID: "f737af00-5e6f-4a95-bf94-738b72990ebd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.952996 4792 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.953173 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.953247 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr4f2\" (UniqueName: \"kubernetes.io/projected/f737af00-5e6f-4a95-bf94-738b72990ebd-kube-api-access-cr4f2\") on node \"crc\" DevicePath \"\"" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.953304 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.953355 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.953405 4792 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.953458 4792 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.448791 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" event={"ID":"f737af00-5e6f-4a95-bf94-738b72990ebd","Type":"ContainerDied","Data":"87d887549298c09ffe2d57352b333fabf0848f6fb2f69c640fc971765e3e3259"} Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.448828 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87d887549298c09ffe2d57352b333fabf0848f6fb2f69c640fc971765e3e3259" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.449417 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.538902 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt"] Mar 01 09:53:28 crc kubenswrapper[4792]: E0301 09:53:28.539368 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f737af00-5e6f-4a95-bf94-738b72990ebd" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.539395 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f737af00-5e6f-4a95-bf94-738b72990ebd" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.539664 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f737af00-5e6f-4a95-bf94-738b72990ebd" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.540486 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.545079 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.545814 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.545972 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.546430 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.546555 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.546670 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.551158 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt"] Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.563743 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thspr\" (UniqueName: \"kubernetes.io/projected/c7230f65-7e9a-4455-8d25-c49393bfbafe-kube-api-access-thspr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.563786 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.563819 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.563837 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.563891 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.563967 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.665247 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.665458 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thspr\" (UniqueName: \"kubernetes.io/projected/c7230f65-7e9a-4455-8d25-c49393bfbafe-kube-api-access-thspr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.665562 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.665647 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.665719 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.665827 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.670122 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.671236 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.671996 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc 
kubenswrapper[4792]: I0301 09:53:28.672539 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.672817 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.688614 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thspr\" (UniqueName: \"kubernetes.io/projected/c7230f65-7e9a-4455-8d25-c49393bfbafe-kube-api-access-thspr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.858544 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:29 crc kubenswrapper[4792]: I0301 09:53:29.370611 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt"] Mar 01 09:53:29 crc kubenswrapper[4792]: W0301 09:53:29.383645 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7230f65_7e9a_4455_8d25_c49393bfbafe.slice/crio-49739f0dd600f78c2fce8ea6d1d733fa8e1503a99144161a871f4dba20c10897 WatchSource:0}: Error finding container 49739f0dd600f78c2fce8ea6d1d733fa8e1503a99144161a871f4dba20c10897: Status 404 returned error can't find the container with id 49739f0dd600f78c2fce8ea6d1d733fa8e1503a99144161a871f4dba20c10897 Mar 01 09:53:29 crc kubenswrapper[4792]: I0301 09:53:29.457867 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" event={"ID":"c7230f65-7e9a-4455-8d25-c49393bfbafe","Type":"ContainerStarted","Data":"49739f0dd600f78c2fce8ea6d1d733fa8e1503a99144161a871f4dba20c10897"} Mar 01 09:53:30 crc kubenswrapper[4792]: I0301 09:53:30.466434 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" event={"ID":"c7230f65-7e9a-4455-8d25-c49393bfbafe","Type":"ContainerStarted","Data":"e854573aa1d54c447e18219253a483ccbce7dfebd37e9e5c1c0e176ad1346674"} Mar 01 09:53:30 crc kubenswrapper[4792]: I0301 09:53:30.484113 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" podStartSLOduration=2.002514313 podStartE2EDuration="2.484090492s" podCreationTimestamp="2026-03-01 09:53:28 +0000 UTC" firstStartedPulling="2026-03-01 09:53:29.38647308 +0000 UTC m=+2738.628352277" lastFinishedPulling="2026-03-01 09:53:29.868049259 +0000 UTC m=+2739.109928456" 
observedRunningTime="2026-03-01 09:53:30.4795671 +0000 UTC m=+2739.721446297" watchObservedRunningTime="2026-03-01 09:53:30.484090492 +0000 UTC m=+2739.725969689" Mar 01 09:53:33 crc kubenswrapper[4792]: I0301 09:53:33.408997 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:53:33 crc kubenswrapper[4792]: E0301 09:53:33.409499 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:53:48 crc kubenswrapper[4792]: I0301 09:53:48.408476 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:53:48 crc kubenswrapper[4792]: E0301 09:53:48.409257 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.151334 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539314-9brl7"] Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.158311 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539314-9brl7" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.163252 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.163252 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.167162 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.186545 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539314-9brl7"] Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.252284 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzcbw\" (UniqueName: \"kubernetes.io/projected/d633fb4c-b1e3-463f-af0a-2891b7130fc0-kube-api-access-bzcbw\") pod \"auto-csr-approver-29539314-9brl7\" (UID: \"d633fb4c-b1e3-463f-af0a-2891b7130fc0\") " pod="openshift-infra/auto-csr-approver-29539314-9brl7" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.353498 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzcbw\" (UniqueName: \"kubernetes.io/projected/d633fb4c-b1e3-463f-af0a-2891b7130fc0-kube-api-access-bzcbw\") pod \"auto-csr-approver-29539314-9brl7\" (UID: \"d633fb4c-b1e3-463f-af0a-2891b7130fc0\") " pod="openshift-infra/auto-csr-approver-29539314-9brl7" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.383648 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzcbw\" (UniqueName: \"kubernetes.io/projected/d633fb4c-b1e3-463f-af0a-2891b7130fc0-kube-api-access-bzcbw\") pod \"auto-csr-approver-29539314-9brl7\" (UID: \"d633fb4c-b1e3-463f-af0a-2891b7130fc0\") " 
pod="openshift-infra/auto-csr-approver-29539314-9brl7" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.409301 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:54:00 crc kubenswrapper[4792]: E0301 09:54:00.409947 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.487654 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539314-9brl7" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.955564 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539314-9brl7"] Mar 01 09:54:01 crc kubenswrapper[4792]: I0301 09:54:01.743401 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539314-9brl7" event={"ID":"d633fb4c-b1e3-463f-af0a-2891b7130fc0","Type":"ContainerStarted","Data":"81081e6aa3da0609653570cbd97f437d3c30358ab7d1ccfd226d44ca883e5aef"} Mar 01 09:54:02 crc kubenswrapper[4792]: I0301 09:54:02.751772 4792 generic.go:334] "Generic (PLEG): container finished" podID="d633fb4c-b1e3-463f-af0a-2891b7130fc0" containerID="19ca062443b337d8791859ab02de766e48126cb99d1f720dcaa520cb4be8f904" exitCode=0 Mar 01 09:54:02 crc kubenswrapper[4792]: I0301 09:54:02.751865 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539314-9brl7" event={"ID":"d633fb4c-b1e3-463f-af0a-2891b7130fc0","Type":"ContainerDied","Data":"19ca062443b337d8791859ab02de766e48126cb99d1f720dcaa520cb4be8f904"} 
Mar 01 09:54:04 crc kubenswrapper[4792]: I0301 09:54:04.050235 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539314-9brl7" Mar 01 09:54:04 crc kubenswrapper[4792]: I0301 09:54:04.137390 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzcbw\" (UniqueName: \"kubernetes.io/projected/d633fb4c-b1e3-463f-af0a-2891b7130fc0-kube-api-access-bzcbw\") pod \"d633fb4c-b1e3-463f-af0a-2891b7130fc0\" (UID: \"d633fb4c-b1e3-463f-af0a-2891b7130fc0\") " Mar 01 09:54:04 crc kubenswrapper[4792]: I0301 09:54:04.155128 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d633fb4c-b1e3-463f-af0a-2891b7130fc0-kube-api-access-bzcbw" (OuterVolumeSpecName: "kube-api-access-bzcbw") pod "d633fb4c-b1e3-463f-af0a-2891b7130fc0" (UID: "d633fb4c-b1e3-463f-af0a-2891b7130fc0"). InnerVolumeSpecName "kube-api-access-bzcbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:54:04 crc kubenswrapper[4792]: I0301 09:54:04.239799 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzcbw\" (UniqueName: \"kubernetes.io/projected/d633fb4c-b1e3-463f-af0a-2891b7130fc0-kube-api-access-bzcbw\") on node \"crc\" DevicePath \"\"" Mar 01 09:54:04 crc kubenswrapper[4792]: I0301 09:54:04.769071 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539314-9brl7" event={"ID":"d633fb4c-b1e3-463f-af0a-2891b7130fc0","Type":"ContainerDied","Data":"81081e6aa3da0609653570cbd97f437d3c30358ab7d1ccfd226d44ca883e5aef"} Mar 01 09:54:04 crc kubenswrapper[4792]: I0301 09:54:04.769115 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81081e6aa3da0609653570cbd97f437d3c30358ab7d1ccfd226d44ca883e5aef" Mar 01 09:54:04 crc kubenswrapper[4792]: I0301 09:54:04.769132 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539314-9brl7" Mar 01 09:54:05 crc kubenswrapper[4792]: I0301 09:54:05.124550 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539308-lnc4n"] Mar 01 09:54:05 crc kubenswrapper[4792]: I0301 09:54:05.138559 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539308-lnc4n"] Mar 01 09:54:05 crc kubenswrapper[4792]: I0301 09:54:05.420365 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e3f198d-a642-45b3-9a5a-fd5906670db8" path="/var/lib/kubelet/pods/1e3f198d-a642-45b3-9a5a-fd5906670db8/volumes" Mar 01 09:54:11 crc kubenswrapper[4792]: I0301 09:54:11.415230 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:54:11 crc kubenswrapper[4792]: E0301 09:54:11.416286 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:54:22 crc kubenswrapper[4792]: I0301 09:54:22.408520 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:54:22 crc kubenswrapper[4792]: E0301 09:54:22.409527 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:54:31 crc kubenswrapper[4792]: I0301 09:54:31.430669 4792 scope.go:117] "RemoveContainer" containerID="cde0b22712c7c2f1430743fdebf0e1e49438b47b056e66c49fd78cf546ba54f9" Mar 01 09:54:33 crc kubenswrapper[4792]: I0301 09:54:33.409020 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:54:33 crc kubenswrapper[4792]: E0301 09:54:33.409563 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.176369 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c2xpd"] Mar 01 09:54:34 crc kubenswrapper[4792]: E0301 09:54:34.176725 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d633fb4c-b1e3-463f-af0a-2891b7130fc0" containerName="oc" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.176743 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d633fb4c-b1e3-463f-af0a-2891b7130fc0" containerName="oc" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.176942 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d633fb4c-b1e3-463f-af0a-2891b7130fc0" containerName="oc" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.178400 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.191027 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2xpd"] Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.292800 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc2cn\" (UniqueName: \"kubernetes.io/projected/18317120-5fe3-415e-9646-44ec3a528eb7-kube-api-access-zc2cn\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.292898 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-utilities\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.292946 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-catalog-content\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.394595 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-catalog-content\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.394982 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-zc2cn\" (UniqueName: \"kubernetes.io/projected/18317120-5fe3-415e-9646-44ec3a528eb7-kube-api-access-zc2cn\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.395055 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-utilities\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.395177 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-catalog-content\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.395305 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-utilities\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.416707 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc2cn\" (UniqueName: \"kubernetes.io/projected/18317120-5fe3-415e-9646-44ec3a528eb7-kube-api-access-zc2cn\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.511850 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:35 crc kubenswrapper[4792]: I0301 09:54:35.000928 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2xpd"] Mar 01 09:54:35 crc kubenswrapper[4792]: I0301 09:54:35.020881 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2xpd" event={"ID":"18317120-5fe3-415e-9646-44ec3a528eb7","Type":"ContainerStarted","Data":"ee907b4b929868643143bf7639b65b69ce3a2888c7d73c8cbf27659d5681a348"} Mar 01 09:54:36 crc kubenswrapper[4792]: I0301 09:54:36.031712 4792 generic.go:334] "Generic (PLEG): container finished" podID="18317120-5fe3-415e-9646-44ec3a528eb7" containerID="b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8" exitCode=0 Mar 01 09:54:36 crc kubenswrapper[4792]: I0301 09:54:36.031779 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2xpd" event={"ID":"18317120-5fe3-415e-9646-44ec3a528eb7","Type":"ContainerDied","Data":"b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8"} Mar 01 09:54:37 crc kubenswrapper[4792]: I0301 09:54:37.046955 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2xpd" event={"ID":"18317120-5fe3-415e-9646-44ec3a528eb7","Type":"ContainerStarted","Data":"86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0"} Mar 01 09:54:42 crc kubenswrapper[4792]: I0301 09:54:42.086296 4792 generic.go:334] "Generic (PLEG): container finished" podID="18317120-5fe3-415e-9646-44ec3a528eb7" containerID="86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0" exitCode=0 Mar 01 09:54:42 crc kubenswrapper[4792]: I0301 09:54:42.086384 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2xpd" 
event={"ID":"18317120-5fe3-415e-9646-44ec3a528eb7","Type":"ContainerDied","Data":"86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0"} Mar 01 09:54:43 crc kubenswrapper[4792]: I0301 09:54:43.107026 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2xpd" event={"ID":"18317120-5fe3-415e-9646-44ec3a528eb7","Type":"ContainerStarted","Data":"32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74"} Mar 01 09:54:43 crc kubenswrapper[4792]: I0301 09:54:43.127045 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c2xpd" podStartSLOduration=2.69614874 podStartE2EDuration="9.127017224s" podCreationTimestamp="2026-03-01 09:54:34 +0000 UTC" firstStartedPulling="2026-03-01 09:54:36.034064606 +0000 UTC m=+2805.275943803" lastFinishedPulling="2026-03-01 09:54:42.46493308 +0000 UTC m=+2811.706812287" observedRunningTime="2026-03-01 09:54:43.124550583 +0000 UTC m=+2812.366429790" watchObservedRunningTime="2026-03-01 09:54:43.127017224 +0000 UTC m=+2812.368896451" Mar 01 09:54:44 crc kubenswrapper[4792]: I0301 09:54:44.513149 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:44 crc kubenswrapper[4792]: I0301 09:54:44.514110 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:45 crc kubenswrapper[4792]: I0301 09:54:45.410956 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:54:45 crc kubenswrapper[4792]: I0301 09:54:45.558689 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c2xpd" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="registry-server" probeResult="failure" output=< Mar 01 09:54:45 crc kubenswrapper[4792]: timeout: failed to connect 
service ":50051" within 1s Mar 01 09:54:45 crc kubenswrapper[4792]: > Mar 01 09:54:46 crc kubenswrapper[4792]: I0301 09:54:46.139247 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"0c8bbdbe3553a3ed4617103bbd379245337bfad8f18870faba13af9b3c14caa1"} Mar 01 09:54:54 crc kubenswrapper[4792]: I0301 09:54:54.552571 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:54 crc kubenswrapper[4792]: I0301 09:54:54.597291 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:54 crc kubenswrapper[4792]: I0301 09:54:54.786810 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2xpd"] Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.226063 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c2xpd" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="registry-server" containerID="cri-o://32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74" gracePeriod=2 Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.695542 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.727507 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-catalog-content\") pod \"18317120-5fe3-415e-9646-44ec3a528eb7\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.727624 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-utilities\") pod \"18317120-5fe3-415e-9646-44ec3a528eb7\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.727656 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc2cn\" (UniqueName: \"kubernetes.io/projected/18317120-5fe3-415e-9646-44ec3a528eb7-kube-api-access-zc2cn\") pod \"18317120-5fe3-415e-9646-44ec3a528eb7\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.728490 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-utilities" (OuterVolumeSpecName: "utilities") pod "18317120-5fe3-415e-9646-44ec3a528eb7" (UID: "18317120-5fe3-415e-9646-44ec3a528eb7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.734189 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18317120-5fe3-415e-9646-44ec3a528eb7-kube-api-access-zc2cn" (OuterVolumeSpecName: "kube-api-access-zc2cn") pod "18317120-5fe3-415e-9646-44ec3a528eb7" (UID: "18317120-5fe3-415e-9646-44ec3a528eb7"). InnerVolumeSpecName "kube-api-access-zc2cn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.830488 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.830525 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc2cn\" (UniqueName: \"kubernetes.io/projected/18317120-5fe3-415e-9646-44ec3a528eb7-kube-api-access-zc2cn\") on node \"crc\" DevicePath \"\"" Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.852649 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18317120-5fe3-415e-9646-44ec3a528eb7" (UID: "18317120-5fe3-415e-9646-44ec3a528eb7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.931427 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.236594 4792 generic.go:334] "Generic (PLEG): container finished" podID="18317120-5fe3-415e-9646-44ec3a528eb7" containerID="32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74" exitCode=0 Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.236640 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.236658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2xpd" event={"ID":"18317120-5fe3-415e-9646-44ec3a528eb7","Type":"ContainerDied","Data":"32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74"} Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.237735 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2xpd" event={"ID":"18317120-5fe3-415e-9646-44ec3a528eb7","Type":"ContainerDied","Data":"ee907b4b929868643143bf7639b65b69ce3a2888c7d73c8cbf27659d5681a348"} Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.237756 4792 scope.go:117] "RemoveContainer" containerID="32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.267852 4792 scope.go:117] "RemoveContainer" containerID="86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.278444 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2xpd"] Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.297947 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c2xpd"] Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.301205 4792 scope.go:117] "RemoveContainer" containerID="b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.346802 4792 scope.go:117] "RemoveContainer" containerID="32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74" Mar 01 09:54:57 crc kubenswrapper[4792]: E0301 09:54:57.347686 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74\": container with ID starting with 32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74 not found: ID does not exist" containerID="32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.347728 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74"} err="failed to get container status \"32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74\": rpc error: code = NotFound desc = could not find container \"32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74\": container with ID starting with 32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74 not found: ID does not exist" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.347806 4792 scope.go:117] "RemoveContainer" containerID="86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0" Mar 01 09:54:57 crc kubenswrapper[4792]: E0301 09:54:57.348646 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0\": container with ID starting with 86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0 not found: ID does not exist" containerID="86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.348674 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0"} err="failed to get container status \"86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0\": rpc error: code = NotFound desc = could not find container \"86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0\": container with ID 
starting with 86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0 not found: ID does not exist" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.348687 4792 scope.go:117] "RemoveContainer" containerID="b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8" Mar 01 09:54:57 crc kubenswrapper[4792]: E0301 09:54:57.348917 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8\": container with ID starting with b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8 not found: ID does not exist" containerID="b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.348936 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8"} err="failed to get container status \"b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8\": rpc error: code = NotFound desc = could not find container \"b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8\": container with ID starting with b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8 not found: ID does not exist" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.419389 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" path="/var/lib/kubelet/pods/18317120-5fe3-415e-9646-44ec3a528eb7/volumes" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.142043 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539316-8swf8"] Mar 01 09:56:00 crc kubenswrapper[4792]: E0301 09:56:00.142949 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="registry-server" Mar 01 09:56:00 crc 
kubenswrapper[4792]: I0301 09:56:00.142965 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="registry-server" Mar 01 09:56:00 crc kubenswrapper[4792]: E0301 09:56:00.142981 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="extract-utilities" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.142989 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="extract-utilities" Mar 01 09:56:00 crc kubenswrapper[4792]: E0301 09:56:00.143017 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="extract-content" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.143026 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="extract-content" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.143199 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="registry-server" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.143743 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539316-8swf8" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.145971 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.146010 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.146317 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.149464 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfcjg\" (UniqueName: \"kubernetes.io/projected/e9273c82-13c3-43c5-b90e-16fdb09f082e-kube-api-access-qfcjg\") pod \"auto-csr-approver-29539316-8swf8\" (UID: \"e9273c82-13c3-43c5-b90e-16fdb09f082e\") " pod="openshift-infra/auto-csr-approver-29539316-8swf8" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.160534 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539316-8swf8"] Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.251388 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfcjg\" (UniqueName: \"kubernetes.io/projected/e9273c82-13c3-43c5-b90e-16fdb09f082e-kube-api-access-qfcjg\") pod \"auto-csr-approver-29539316-8swf8\" (UID: \"e9273c82-13c3-43c5-b90e-16fdb09f082e\") " pod="openshift-infra/auto-csr-approver-29539316-8swf8" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.269465 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfcjg\" (UniqueName: \"kubernetes.io/projected/e9273c82-13c3-43c5-b90e-16fdb09f082e-kube-api-access-qfcjg\") pod \"auto-csr-approver-29539316-8swf8\" (UID: \"e9273c82-13c3-43c5-b90e-16fdb09f082e\") " 
pod="openshift-infra/auto-csr-approver-29539316-8swf8" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.461065 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539316-8swf8" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.900949 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539316-8swf8"] Mar 01 09:56:00 crc kubenswrapper[4792]: W0301 09:56:00.903582 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9273c82_13c3_43c5_b90e_16fdb09f082e.slice/crio-1dad835c77013b58809a520c2651be9bac6b64755244aae7aa6a7ab543562843 WatchSource:0}: Error finding container 1dad835c77013b58809a520c2651be9bac6b64755244aae7aa6a7ab543562843: Status 404 returned error can't find the container with id 1dad835c77013b58809a520c2651be9bac6b64755244aae7aa6a7ab543562843 Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.905697 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:56:01 crc kubenswrapper[4792]: I0301 09:56:01.801884 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539316-8swf8" event={"ID":"e9273c82-13c3-43c5-b90e-16fdb09f082e","Type":"ContainerStarted","Data":"1dad835c77013b58809a520c2651be9bac6b64755244aae7aa6a7ab543562843"} Mar 01 09:56:02 crc kubenswrapper[4792]: I0301 09:56:02.810833 4792 generic.go:334] "Generic (PLEG): container finished" podID="e9273c82-13c3-43c5-b90e-16fdb09f082e" containerID="3aeb9f44a1b454186ac24af4b6119b2ea036267663153223c161c28c89a3a926" exitCode=0 Mar 01 09:56:02 crc kubenswrapper[4792]: I0301 09:56:02.810930 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539316-8swf8" 
event={"ID":"e9273c82-13c3-43c5-b90e-16fdb09f082e","Type":"ContainerDied","Data":"3aeb9f44a1b454186ac24af4b6119b2ea036267663153223c161c28c89a3a926"} Mar 01 09:56:04 crc kubenswrapper[4792]: I0301 09:56:04.133193 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539316-8swf8" Mar 01 09:56:04 crc kubenswrapper[4792]: I0301 09:56:04.318203 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfcjg\" (UniqueName: \"kubernetes.io/projected/e9273c82-13c3-43c5-b90e-16fdb09f082e-kube-api-access-qfcjg\") pod \"e9273c82-13c3-43c5-b90e-16fdb09f082e\" (UID: \"e9273c82-13c3-43c5-b90e-16fdb09f082e\") " Mar 01 09:56:04 crc kubenswrapper[4792]: I0301 09:56:04.325220 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9273c82-13c3-43c5-b90e-16fdb09f082e-kube-api-access-qfcjg" (OuterVolumeSpecName: "kube-api-access-qfcjg") pod "e9273c82-13c3-43c5-b90e-16fdb09f082e" (UID: "e9273c82-13c3-43c5-b90e-16fdb09f082e"). InnerVolumeSpecName "kube-api-access-qfcjg". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:56:04 crc kubenswrapper[4792]: I0301 09:56:04.420251 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfcjg\" (UniqueName: \"kubernetes.io/projected/e9273c82-13c3-43c5-b90e-16fdb09f082e-kube-api-access-qfcjg\") on node \"crc\" DevicePath \"\""
Mar 01 09:56:04 crc kubenswrapper[4792]: I0301 09:56:04.828099 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539316-8swf8" event={"ID":"e9273c82-13c3-43c5-b90e-16fdb09f082e","Type":"ContainerDied","Data":"1dad835c77013b58809a520c2651be9bac6b64755244aae7aa6a7ab543562843"}
Mar 01 09:56:04 crc kubenswrapper[4792]: I0301 09:56:04.828139 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dad835c77013b58809a520c2651be9bac6b64755244aae7aa6a7ab543562843"
Mar 01 09:56:04 crc kubenswrapper[4792]: I0301 09:56:04.828152 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539316-8swf8"
Mar 01 09:56:05 crc kubenswrapper[4792]: I0301 09:56:05.204724 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539310-cr6qh"]
Mar 01 09:56:05 crc kubenswrapper[4792]: I0301 09:56:05.212094 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539310-cr6qh"]
Mar 01 09:56:05 crc kubenswrapper[4792]: I0301 09:56:05.418005 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf147424-57ff-455c-9aac-e32adcab851e" path="/var/lib/kubelet/pods/bf147424-57ff-455c-9aac-e32adcab851e/volumes"
Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.953659 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gd957"]
Mar 01 09:56:22 crc kubenswrapper[4792]: E0301 09:56:22.955727 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9273c82-13c3-43c5-b90e-16fdb09f082e" containerName="oc"
Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.955815 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9273c82-13c3-43c5-b90e-16fdb09f082e" containerName="oc"
Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.956098 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9273c82-13c3-43c5-b90e-16fdb09f082e" containerName="oc"
Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.957463 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.970318 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gd957"]
Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.993397 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-utilities\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.993538 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfwsm\" (UniqueName: \"kubernetes.io/projected/334b1950-af06-4648-98dc-543534fc216a-kube-api-access-gfwsm\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.993637 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-catalog-content\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.096153 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-catalog-content\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.096233 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-utilities\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.096308 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfwsm\" (UniqueName: \"kubernetes.io/projected/334b1950-af06-4648-98dc-543534fc216a-kube-api-access-gfwsm\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.096793 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-catalog-content\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.096879 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-utilities\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.128532 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfwsm\" (UniqueName: \"kubernetes.io/projected/334b1950-af06-4648-98dc-543534fc216a-kube-api-access-gfwsm\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.310377 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.838162 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gd957"]
Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.982013 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gd957" event={"ID":"334b1950-af06-4648-98dc-543534fc216a","Type":"ContainerStarted","Data":"87c281219a698791250b09d69a542e87c72aa7e1f46af736492aa9a12cf9e627"}
Mar 01 09:56:24 crc kubenswrapper[4792]: I0301 09:56:24.990172 4792 generic.go:334] "Generic (PLEG): container finished" podID="334b1950-af06-4648-98dc-543534fc216a" containerID="519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a" exitCode=0
Mar 01 09:56:24 crc kubenswrapper[4792]: I0301 09:56:24.990246 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gd957" event={"ID":"334b1950-af06-4648-98dc-543534fc216a","Type":"ContainerDied","Data":"519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a"}
Mar 01 09:56:26 crc kubenswrapper[4792]: I0301 09:56:25.999656 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gd957" event={"ID":"334b1950-af06-4648-98dc-543534fc216a","Type":"ContainerStarted","Data":"6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3"}
Mar 01 09:56:27 crc kubenswrapper[4792]: I0301 09:56:27.010742 4792 generic.go:334] "Generic (PLEG): container finished" podID="334b1950-af06-4648-98dc-543534fc216a" containerID="6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3" exitCode=0
Mar 01 09:56:27 crc kubenswrapper[4792]: I0301 09:56:27.010788 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gd957" event={"ID":"334b1950-af06-4648-98dc-543534fc216a","Type":"ContainerDied","Data":"6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3"}
Mar 01 09:56:28 crc kubenswrapper[4792]: I0301 09:56:28.021284 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gd957" event={"ID":"334b1950-af06-4648-98dc-543534fc216a","Type":"ContainerStarted","Data":"a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe"}
Mar 01 09:56:28 crc kubenswrapper[4792]: I0301 09:56:28.043709 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gd957" podStartSLOduration=3.6669071989999997 podStartE2EDuration="6.043687373s" podCreationTimestamp="2026-03-01 09:56:22 +0000 UTC" firstStartedPulling="2026-03-01 09:56:24.993047502 +0000 UTC m=+2914.234926699" lastFinishedPulling="2026-03-01 09:56:27.369827676 +0000 UTC m=+2916.611706873" observedRunningTime="2026-03-01 09:56:28.03873579 +0000 UTC m=+2917.280614987" watchObservedRunningTime="2026-03-01 09:56:28.043687373 +0000 UTC m=+2917.285566560"
Mar 01 09:56:30 crc kubenswrapper[4792]: I0301 09:56:30.870459 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f6q2t"]
Mar 01 09:56:30 crc kubenswrapper[4792]: I0301 09:56:30.874235 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:30 crc kubenswrapper[4792]: I0301 09:56:30.882703 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6q2t"]
Mar 01 09:56:30 crc kubenswrapper[4792]: I0301 09:56:30.930368 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5hgj\" (UniqueName: \"kubernetes.io/projected/01308ffe-b402-42f1-8895-22ee5823304b-kube-api-access-g5hgj\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:30 crc kubenswrapper[4792]: I0301 09:56:30.930642 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-catalog-content\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:30 crc kubenswrapper[4792]: I0301 09:56:30.930805 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-utilities\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.032154 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-catalog-content\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.032257 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-utilities\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.032681 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-catalog-content\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.032736 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5hgj\" (UniqueName: \"kubernetes.io/projected/01308ffe-b402-42f1-8895-22ee5823304b-kube-api-access-g5hgj\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.032835 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-utilities\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.062654 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5hgj\" (UniqueName: \"kubernetes.io/projected/01308ffe-b402-42f1-8895-22ee5823304b-kube-api-access-g5hgj\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.198739 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.564149 4792 scope.go:117] "RemoveContainer" containerID="3be8237f11ee8a9c2a66a6dce0cfdb5c72c7e1c7d5445dc46a852faf899f2940"
Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.667815 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6q2t"]
Mar 01 09:56:31 crc kubenswrapper[4792]: W0301 09:56:31.687382 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01308ffe_b402_42f1_8895_22ee5823304b.slice/crio-27f6a0cf37bf0be2aecf2f55c75d9a40b34c5ba6225ba36896bc7c41c0d112ec WatchSource:0}: Error finding container 27f6a0cf37bf0be2aecf2f55c75d9a40b34c5ba6225ba36896bc7c41c0d112ec: Status 404 returned error can't find the container with id 27f6a0cf37bf0be2aecf2f55c75d9a40b34c5ba6225ba36896bc7c41c0d112ec
Mar 01 09:56:32 crc kubenswrapper[4792]: I0301 09:56:32.053079 4792 generic.go:334] "Generic (PLEG): container finished" podID="01308ffe-b402-42f1-8895-22ee5823304b" containerID="d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca" exitCode=0
Mar 01 09:56:32 crc kubenswrapper[4792]: I0301 09:56:32.053121 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6q2t" event={"ID":"01308ffe-b402-42f1-8895-22ee5823304b","Type":"ContainerDied","Data":"d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca"}
Mar 01 09:56:32 crc kubenswrapper[4792]: I0301 09:56:32.053164 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6q2t" event={"ID":"01308ffe-b402-42f1-8895-22ee5823304b","Type":"ContainerStarted","Data":"27f6a0cf37bf0be2aecf2f55c75d9a40b34c5ba6225ba36896bc7c41c0d112ec"}
Mar 01 09:56:33 crc kubenswrapper[4792]: I0301 09:56:33.063647 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6q2t" event={"ID":"01308ffe-b402-42f1-8895-22ee5823304b","Type":"ContainerStarted","Data":"ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200"}
Mar 01 09:56:33 crc kubenswrapper[4792]: I0301 09:56:33.311153 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:33 crc kubenswrapper[4792]: I0301 09:56:33.311209 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:33 crc kubenswrapper[4792]: I0301 09:56:33.496037 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:34 crc kubenswrapper[4792]: I0301 09:56:34.117440 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:35 crc kubenswrapper[4792]: I0301 09:56:35.079070 4792 generic.go:334] "Generic (PLEG): container finished" podID="01308ffe-b402-42f1-8895-22ee5823304b" containerID="ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200" exitCode=0
Mar 01 09:56:35 crc kubenswrapper[4792]: I0301 09:56:35.079156 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6q2t" event={"ID":"01308ffe-b402-42f1-8895-22ee5823304b","Type":"ContainerDied","Data":"ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200"}
Mar 01 09:56:35 crc kubenswrapper[4792]: I0301 09:56:35.916697 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gd957"]
Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.089341 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gd957" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="registry-server" containerID="cri-o://a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe" gracePeriod=2
Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.089790 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6q2t" event={"ID":"01308ffe-b402-42f1-8895-22ee5823304b","Type":"ContainerStarted","Data":"eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696"}
Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.112311 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f6q2t" podStartSLOduration=2.567439225 podStartE2EDuration="6.112290979s" podCreationTimestamp="2026-03-01 09:56:30 +0000 UTC" firstStartedPulling="2026-03-01 09:56:32.055549364 +0000 UTC m=+2921.297428561" lastFinishedPulling="2026-03-01 09:56:35.600401118 +0000 UTC m=+2924.842280315" observedRunningTime="2026-03-01 09:56:36.108197147 +0000 UTC m=+2925.350076344" watchObservedRunningTime="2026-03-01 09:56:36.112290979 +0000 UTC m=+2925.354170176"
Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.578787 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.639048 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-catalog-content\") pod \"334b1950-af06-4648-98dc-543534fc216a\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") "
Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.639185 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfwsm\" (UniqueName: \"kubernetes.io/projected/334b1950-af06-4648-98dc-543534fc216a-kube-api-access-gfwsm\") pod \"334b1950-af06-4648-98dc-543534fc216a\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") "
Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.639252 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-utilities\") pod \"334b1950-af06-4648-98dc-543534fc216a\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") "
Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.640128 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-utilities" (OuterVolumeSpecName: "utilities") pod "334b1950-af06-4648-98dc-543534fc216a" (UID: "334b1950-af06-4648-98dc-543534fc216a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.645843 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/334b1950-af06-4648-98dc-543534fc216a-kube-api-access-gfwsm" (OuterVolumeSpecName: "kube-api-access-gfwsm") pod "334b1950-af06-4648-98dc-543534fc216a" (UID: "334b1950-af06-4648-98dc-543534fc216a"). InnerVolumeSpecName "kube-api-access-gfwsm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.716743 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "334b1950-af06-4648-98dc-543534fc216a" (UID: "334b1950-af06-4648-98dc-543534fc216a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.741059 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfwsm\" (UniqueName: \"kubernetes.io/projected/334b1950-af06-4648-98dc-543534fc216a-kube-api-access-gfwsm\") on node \"crc\" DevicePath \"\""
Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.741091 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-utilities\") on node \"crc\" DevicePath \"\""
Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.741100 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.108856 4792 generic.go:334] "Generic (PLEG): container finished" podID="334b1950-af06-4648-98dc-543534fc216a" containerID="a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe" exitCode=0
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.110230 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gd957" event={"ID":"334b1950-af06-4648-98dc-543534fc216a","Type":"ContainerDied","Data":"a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe"}
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.110268 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gd957" event={"ID":"334b1950-af06-4648-98dc-543534fc216a","Type":"ContainerDied","Data":"87c281219a698791250b09d69a542e87c72aa7e1f46af736492aa9a12cf9e627"}
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.110286 4792 scope.go:117] "RemoveContainer" containerID="a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe"
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.110505 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gd957"
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.147346 4792 scope.go:117] "RemoveContainer" containerID="6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3"
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.199203 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gd957"]
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.210287 4792 scope.go:117] "RemoveContainer" containerID="519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a"
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.218559 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gd957"]
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.248081 4792 scope.go:117] "RemoveContainer" containerID="a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe"
Mar 01 09:56:37 crc kubenswrapper[4792]: E0301 09:56:37.251983 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe\": container with ID starting with a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe not found: ID does not exist" containerID="a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe"
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.252027 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe"} err="failed to get container status \"a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe\": rpc error: code = NotFound desc = could not find container \"a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe\": container with ID starting with a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe not found: ID does not exist"
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.252058 4792 scope.go:117] "RemoveContainer" containerID="6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3"
Mar 01 09:56:37 crc kubenswrapper[4792]: E0301 09:56:37.255358 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3\": container with ID starting with 6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3 not found: ID does not exist" containerID="6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3"
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.255417 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3"} err="failed to get container status \"6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3\": rpc error: code = NotFound desc = could not find container \"6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3\": container with ID starting with 6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3 not found: ID does not exist"
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.255440 4792 scope.go:117] "RemoveContainer" containerID="519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a"
Mar 01 09:56:37 crc kubenswrapper[4792]: E0301 09:56:37.262247 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a\": container with ID starting with 519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a not found: ID does not exist" containerID="519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a"
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.262295 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a"} err="failed to get container status \"519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a\": rpc error: code = NotFound desc = could not find container \"519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a\": container with ID starting with 519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a not found: ID does not exist"
Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.420215 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="334b1950-af06-4648-98dc-543534fc216a" path="/var/lib/kubelet/pods/334b1950-af06-4648-98dc-543534fc216a/volumes"
Mar 01 09:56:41 crc kubenswrapper[4792]: I0301 09:56:41.199702 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:41 crc kubenswrapper[4792]: I0301 09:56:41.200275 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:41 crc kubenswrapper[4792]: I0301 09:56:41.294177 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:42 crc kubenswrapper[4792]: I0301 09:56:42.208805 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:42 crc kubenswrapper[4792]: I0301 09:56:42.259000 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6q2t"]
Mar 01 09:56:44 crc kubenswrapper[4792]: I0301 09:56:44.174216 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f6q2t" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="registry-server" containerID="cri-o://eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696" gracePeriod=2
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.089644 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.182724 4792 generic.go:334] "Generic (PLEG): container finished" podID="01308ffe-b402-42f1-8895-22ee5823304b" containerID="eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696" exitCode=0
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.182776 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6q2t" event={"ID":"01308ffe-b402-42f1-8895-22ee5823304b","Type":"ContainerDied","Data":"eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696"}
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.182872 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6q2t" event={"ID":"01308ffe-b402-42f1-8895-22ee5823304b","Type":"ContainerDied","Data":"27f6a0cf37bf0be2aecf2f55c75d9a40b34c5ba6225ba36896bc7c41c0d112ec"}
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.182911 4792 scope.go:117] "RemoveContainer" containerID="eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696"
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.183744 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6q2t"
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.207547 4792 scope.go:117] "RemoveContainer" containerID="ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200"
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.230926 4792 scope.go:117] "RemoveContainer" containerID="d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca"
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.270194 4792 scope.go:117] "RemoveContainer" containerID="eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696"
Mar 01 09:56:45 crc kubenswrapper[4792]: E0301 09:56:45.270539 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696\": container with ID starting with eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696 not found: ID does not exist" containerID="eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696"
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.270574 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696"} err="failed to get container status \"eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696\": rpc error: code = NotFound desc = could not find container \"eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696\": container with ID starting with eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696 not found: ID does not exist"
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.270595 4792 scope.go:117] "RemoveContainer" containerID="ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200"
Mar 01 09:56:45 crc kubenswrapper[4792]: E0301 09:56:45.271045 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200\": container with ID starting with ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200 not found: ID does not exist" containerID="ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200"
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.271069 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200"} err="failed to get container status \"ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200\": rpc error: code = NotFound desc = could not find container \"ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200\": container with ID starting with ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200 not found: ID does not exist"
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.271084 4792 scope.go:117] "RemoveContainer" containerID="d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca"
Mar 01 09:56:45 crc kubenswrapper[4792]: E0301 09:56:45.271361 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca\": container with ID starting with d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca not found: ID does not exist" containerID="d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca"
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.271391 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca"} err="failed to get container status \"d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca\": rpc error: code = NotFound desc = could not find container \"d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca\": container with ID starting with d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca not found: ID does not exist"
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.287393 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5hgj\" (UniqueName: \"kubernetes.io/projected/01308ffe-b402-42f1-8895-22ee5823304b-kube-api-access-g5hgj\") pod \"01308ffe-b402-42f1-8895-22ee5823304b\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") "
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.287635 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-utilities\") pod \"01308ffe-b402-42f1-8895-22ee5823304b\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") "
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.287782 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-catalog-content\") pod \"01308ffe-b402-42f1-8895-22ee5823304b\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") "
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.290778 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-utilities" (OuterVolumeSpecName: "utilities") pod "01308ffe-b402-42f1-8895-22ee5823304b" (UID: "01308ffe-b402-42f1-8895-22ee5823304b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.293455 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01308ffe-b402-42f1-8895-22ee5823304b-kube-api-access-g5hgj" (OuterVolumeSpecName: "kube-api-access-g5hgj") pod "01308ffe-b402-42f1-8895-22ee5823304b" (UID: "01308ffe-b402-42f1-8895-22ee5823304b"). InnerVolumeSpecName "kube-api-access-g5hgj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.317694 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01308ffe-b402-42f1-8895-22ee5823304b" (UID: "01308ffe-b402-42f1-8895-22ee5823304b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:56:46 crc kubenswrapper[4792]: I0301 09:56:46.040188 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-utilities\") on node \"crc\" DevicePath \"\""
Mar 01 09:56:46 crc kubenswrapper[4792]: I0301 09:56:46.040242 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 01 09:56:46 crc kubenswrapper[4792]: I0301 09:56:46.040259 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5hgj\" (UniqueName: \"kubernetes.io/projected/01308ffe-b402-42f1-8895-22ee5823304b-kube-api-access-g5hgj\") on node \"crc\" DevicePath \"\""
Mar 01 09:56:46 crc kubenswrapper[4792]: I0301 09:56:46.143180 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6q2t"]
Mar 01 09:56:46 crc kubenswrapper[4792]: I0301 09:56:46.188474 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6q2t"]
Mar 01 09:56:47 crc kubenswrapper[4792]: I0301 09:56:47.421891 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01308ffe-b402-42f1-8895-22ee5823304b" path="/var/lib/kubelet/pods/01308ffe-b402-42f1-8895-22ee5823304b/volumes"
Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.900248 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-srpk8"]
Mar 01 09:57:02 crc kubenswrapper[4792]: E0301 09:57:02.901190 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="extract-utilities"
Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.901206 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="extract-utilities"
Mar 01 09:57:02 crc kubenswrapper[4792]: E0301 09:57:02.901224 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="extract-content"
Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.901233 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="extract-content"
Mar 01 09:57:02 crc kubenswrapper[4792]: E0301 09:57:02.901243 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="extract-content"
Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.901251 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="extract-content"
Mar 01 09:57:02 crc kubenswrapper[4792]: E0301 09:57:02.901279 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="extract-utilities"
Mar 01 09:57:02 crc kubenswrapper[4792]: I0301
09:57:02.901288 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="extract-utilities" Mar 01 09:57:02 crc kubenswrapper[4792]: E0301 09:57:02.901303 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="registry-server" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.901312 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="registry-server" Mar 01 09:57:02 crc kubenswrapper[4792]: E0301 09:57:02.901362 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="registry-server" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.901371 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="registry-server" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.901605 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="registry-server" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.901629 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="registry-server" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.903347 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.913942 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-srpk8"] Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.035488 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-catalog-content\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.035887 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x246v\" (UniqueName: \"kubernetes.io/projected/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-kube-api-access-x246v\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.036117 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-utilities\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.138367 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-utilities\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.138467 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-catalog-content\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.138495 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x246v\" (UniqueName: \"kubernetes.io/projected/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-kube-api-access-x246v\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.138951 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-utilities\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.139073 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-catalog-content\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.169449 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x246v\" (UniqueName: \"kubernetes.io/projected/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-kube-api-access-x246v\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.296157 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.860475 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-srpk8"] Mar 01 09:57:04 crc kubenswrapper[4792]: I0301 09:57:04.349249 4792 generic.go:334] "Generic (PLEG): container finished" podID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerID="c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0" exitCode=0 Mar 01 09:57:04 crc kubenswrapper[4792]: I0301 09:57:04.349532 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srpk8" event={"ID":"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b","Type":"ContainerDied","Data":"c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0"} Mar 01 09:57:04 crc kubenswrapper[4792]: I0301 09:57:04.349558 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srpk8" event={"ID":"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b","Type":"ContainerStarted","Data":"08062e533d06ffaf9308c7a8f8ae79e0e90f4f7bd9be17fa74bd0dac1d520e96"} Mar 01 09:57:04 crc kubenswrapper[4792]: I0301 09:57:04.943477 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:57:04 crc kubenswrapper[4792]: I0301 09:57:04.943778 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:57:05 crc kubenswrapper[4792]: I0301 09:57:05.359232 4792 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-srpk8" event={"ID":"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b","Type":"ContainerStarted","Data":"d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a"} Mar 01 09:57:08 crc kubenswrapper[4792]: I0301 09:57:08.385575 4792 generic.go:334] "Generic (PLEG): container finished" podID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerID="d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a" exitCode=0 Mar 01 09:57:08 crc kubenswrapper[4792]: I0301 09:57:08.385655 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srpk8" event={"ID":"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b","Type":"ContainerDied","Data":"d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a"} Mar 01 09:57:09 crc kubenswrapper[4792]: I0301 09:57:09.396353 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srpk8" event={"ID":"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b","Type":"ContainerStarted","Data":"1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05"} Mar 01 09:57:09 crc kubenswrapper[4792]: I0301 09:57:09.440128 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-srpk8" podStartSLOduration=2.919147154 podStartE2EDuration="7.440108657s" podCreationTimestamp="2026-03-01 09:57:02 +0000 UTC" firstStartedPulling="2026-03-01 09:57:04.352192228 +0000 UTC m=+2953.594071415" lastFinishedPulling="2026-03-01 09:57:08.873153721 +0000 UTC m=+2958.115032918" observedRunningTime="2026-03-01 09:57:09.428545271 +0000 UTC m=+2958.670424468" watchObservedRunningTime="2026-03-01 09:57:09.440108657 +0000 UTC m=+2958.681987854" Mar 01 09:57:13 crc kubenswrapper[4792]: I0301 09:57:13.297457 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:13 crc kubenswrapper[4792]: I0301 
09:57:13.297751 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:13 crc kubenswrapper[4792]: I0301 09:57:13.347187 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:23 crc kubenswrapper[4792]: I0301 09:57:23.371481 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.049970 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-srpk8"] Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.050250 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-srpk8" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="registry-server" containerID="cri-o://1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05" gracePeriod=2 Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.505406 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.558037 4792 generic.go:334] "Generic (PLEG): container finished" podID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerID="1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05" exitCode=0 Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.558078 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srpk8" event={"ID":"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b","Type":"ContainerDied","Data":"1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05"} Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.558104 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srpk8" event={"ID":"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b","Type":"ContainerDied","Data":"08062e533d06ffaf9308c7a8f8ae79e0e90f4f7bd9be17fa74bd0dac1d520e96"} Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.558120 4792 scope.go:117] "RemoveContainer" containerID="1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.558209 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.585895 4792 scope.go:117] "RemoveContainer" containerID="d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.613116 4792 scope.go:117] "RemoveContainer" containerID="c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.649464 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-utilities\") pod \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.649545 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x246v\" (UniqueName: \"kubernetes.io/projected/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-kube-api-access-x246v\") pod \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.649568 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-catalog-content\") pod \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.652971 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-utilities" (OuterVolumeSpecName: "utilities") pod "b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" (UID: "b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.658938 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-kube-api-access-x246v" (OuterVolumeSpecName: "kube-api-access-x246v") pod "b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" (UID: "b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b"). InnerVolumeSpecName "kube-api-access-x246v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.661932 4792 scope.go:117] "RemoveContainer" containerID="1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05" Mar 01 09:57:26 crc kubenswrapper[4792]: E0301 09:57:26.662430 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05\": container with ID starting with 1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05 not found: ID does not exist" containerID="1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.662457 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05"} err="failed to get container status \"1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05\": rpc error: code = NotFound desc = could not find container \"1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05\": container with ID starting with 1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05 not found: ID does not exist" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.662477 4792 scope.go:117] "RemoveContainer" containerID="d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a" Mar 01 09:57:26 crc kubenswrapper[4792]: E0301 09:57:26.662765 
4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a\": container with ID starting with d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a not found: ID does not exist" containerID="d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.662784 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a"} err="failed to get container status \"d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a\": rpc error: code = NotFound desc = could not find container \"d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a\": container with ID starting with d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a not found: ID does not exist" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.662798 4792 scope.go:117] "RemoveContainer" containerID="c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0" Mar 01 09:57:26 crc kubenswrapper[4792]: E0301 09:57:26.663261 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0\": container with ID starting with c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0 not found: ID does not exist" containerID="c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.663286 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0"} err="failed to get container status \"c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0\": rpc error: code = 
NotFound desc = could not find container \"c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0\": container with ID starting with c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0 not found: ID does not exist" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.709799 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" (UID: "b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.752003 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.752068 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x246v\" (UniqueName: \"kubernetes.io/projected/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-kube-api-access-x246v\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.752081 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.890173 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-srpk8"] Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.897657 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-srpk8"] Mar 01 09:57:27 crc kubenswrapper[4792]: I0301 09:57:27.419600 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" path="/var/lib/kubelet/pods/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b/volumes" Mar 01 09:57:34 crc kubenswrapper[4792]: I0301 09:57:34.943329 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:57:34 crc kubenswrapper[4792]: I0301 09:57:34.943924 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:57:44 crc kubenswrapper[4792]: I0301 09:57:44.706179 4792 generic.go:334] "Generic (PLEG): container finished" podID="c7230f65-7e9a-4455-8d25-c49393bfbafe" containerID="e854573aa1d54c447e18219253a483ccbce7dfebd37e9e5c1c0e176ad1346674" exitCode=0 Mar 01 09:57:44 crc kubenswrapper[4792]: I0301 09:57:44.706270 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" event={"ID":"c7230f65-7e9a-4455-8d25-c49393bfbafe","Type":"ContainerDied","Data":"e854573aa1d54c447e18219253a483ccbce7dfebd37e9e5c1c0e176ad1346674"} Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.085437 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.160131 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-combined-ca-bundle\") pod \"c7230f65-7e9a-4455-8d25-c49393bfbafe\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.160174 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ssh-key-openstack-edpm-ipam\") pod \"c7230f65-7e9a-4455-8d25-c49393bfbafe\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.160372 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-inventory\") pod \"c7230f65-7e9a-4455-8d25-c49393bfbafe\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.160437 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thspr\" (UniqueName: \"kubernetes.io/projected/c7230f65-7e9a-4455-8d25-c49393bfbafe-kube-api-access-thspr\") pod \"c7230f65-7e9a-4455-8d25-c49393bfbafe\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.160469 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ceph\") pod \"c7230f65-7e9a-4455-8d25-c49393bfbafe\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.160505 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-secret-0\") pod \"c7230f65-7e9a-4455-8d25-c49393bfbafe\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.165923 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7230f65-7e9a-4455-8d25-c49393bfbafe-kube-api-access-thspr" (OuterVolumeSpecName: "kube-api-access-thspr") pod "c7230f65-7e9a-4455-8d25-c49393bfbafe" (UID: "c7230f65-7e9a-4455-8d25-c49393bfbafe"). InnerVolumeSpecName "kube-api-access-thspr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.166581 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c7230f65-7e9a-4455-8d25-c49393bfbafe" (UID: "c7230f65-7e9a-4455-8d25-c49393bfbafe"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.168800 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ceph" (OuterVolumeSpecName: "ceph") pod "c7230f65-7e9a-4455-8d25-c49393bfbafe" (UID: "c7230f65-7e9a-4455-8d25-c49393bfbafe"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.190279 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-inventory" (OuterVolumeSpecName: "inventory") pod "c7230f65-7e9a-4455-8d25-c49393bfbafe" (UID: "c7230f65-7e9a-4455-8d25-c49393bfbafe"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.190441 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c7230f65-7e9a-4455-8d25-c49393bfbafe" (UID: "c7230f65-7e9a-4455-8d25-c49393bfbafe"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.192572 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "c7230f65-7e9a-4455-8d25-c49393bfbafe" (UID: "c7230f65-7e9a-4455-8d25-c49393bfbafe"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.262589 4792 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.262624 4792 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.262637 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.262646 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.262654 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thspr\" (UniqueName: \"kubernetes.io/projected/c7230f65-7e9a-4455-8d25-c49393bfbafe-kube-api-access-thspr\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.262664 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.724386 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" event={"ID":"c7230f65-7e9a-4455-8d25-c49393bfbafe","Type":"ContainerDied","Data":"49739f0dd600f78c2fce8ea6d1d733fa8e1503a99144161a871f4dba20c10897"} Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.724737 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49739f0dd600f78c2fce8ea6d1d733fa8e1503a99144161a871f4dba20c10897" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.724601 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.822446 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7"] Mar 01 09:57:46 crc kubenswrapper[4792]: E0301 09:57:46.822787 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7230f65-7e9a-4455-8d25-c49393bfbafe" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.822804 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7230f65-7e9a-4455-8d25-c49393bfbafe" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 01 09:57:46 crc kubenswrapper[4792]: E0301 09:57:46.822830 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="extract-utilities" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.822838 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="extract-utilities" Mar 01 09:57:46 crc kubenswrapper[4792]: E0301 09:57:46.822846 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="extract-content" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.822852 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="extract-content" Mar 01 09:57:46 crc kubenswrapper[4792]: E0301 09:57:46.822861 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="registry-server" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.822867 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="registry-server" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.823028 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="registry-server" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.823044 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7230f65-7e9a-4455-8d25-c49393bfbafe" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.823644 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.827064 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.827412 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.827638 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.828187 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.829486 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.829517 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.829650 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.829718 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.829947 
4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.844614 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7"] Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877516 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877564 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877599 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877660 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877702 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877722 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkzx5\" (UniqueName: \"kubernetes.io/projected/d7776778-c586-4ab6-8fdf-bfed4168992d-kube-api-access-qkzx5\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877768 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877793 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-1\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877867 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877887 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877912 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877928 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph-nova-0\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.878144 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.979882 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.979953 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.979979 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: 
\"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980003 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980025 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980070 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980087 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzx5\" (UniqueName: \"kubernetes.io/projected/d7776778-c586-4ab6-8fdf-bfed4168992d-kube-api-access-qkzx5\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 
09:57:46.980131 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980155 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980186 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980228 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980245 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.981001 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.981217 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.984913 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: 
\"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.985127 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.985486 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.985674 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.986375 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc 
kubenswrapper[4792]: I0301 09:57:46.987551 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.987568 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.988986 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.989196 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:47 crc kubenswrapper[4792]: I0301 09:57:47.000818 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:47 crc kubenswrapper[4792]: I0301 09:57:47.006581 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkzx5\" (UniqueName: \"kubernetes.io/projected/d7776778-c586-4ab6-8fdf-bfed4168992d-kube-api-access-qkzx5\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:47 crc kubenswrapper[4792]: I0301 09:57:47.143142 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:47 crc kubenswrapper[4792]: I0301 09:57:47.659917 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7"] Mar 01 09:57:47 crc kubenswrapper[4792]: I0301 09:57:47.733311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" event={"ID":"d7776778-c586-4ab6-8fdf-bfed4168992d","Type":"ContainerStarted","Data":"f6255f4f054d2fb3983c00b8a4caf1954fd5251d362befc32634842b245118e9"} Mar 01 09:57:48 crc kubenswrapper[4792]: I0301 09:57:48.742046 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" event={"ID":"d7776778-c586-4ab6-8fdf-bfed4168992d","Type":"ContainerStarted","Data":"136627f27d1451f409d8606c1096deb71cb0c8c3aa23573781151b161026a979"} Mar 01 09:57:48 crc kubenswrapper[4792]: I0301 09:57:48.764084 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" podStartSLOduration=2.287963233 podStartE2EDuration="2.76406713s" podCreationTimestamp="2026-03-01 09:57:46 +0000 UTC" firstStartedPulling="2026-03-01 09:57:47.665652303 +0000 UTC m=+2996.907531500" lastFinishedPulling="2026-03-01 09:57:48.1417562 +0000 UTC m=+2997.383635397" observedRunningTime="2026-03-01 09:57:48.758507622 +0000 UTC m=+2998.000386829" watchObservedRunningTime="2026-03-01 09:57:48.76406713 +0000 UTC m=+2998.005946327" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.146692 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539318-d9tmc"] Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.149551 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539318-d9tmc" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.154617 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539318-d9tmc"] Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.154766 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.154960 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.155033 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.237858 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sv8q\" (UniqueName: \"kubernetes.io/projected/bf3aad6a-fbd9-4a24-a489-33507709811b-kube-api-access-9sv8q\") pod \"auto-csr-approver-29539318-d9tmc\" (UID: \"bf3aad6a-fbd9-4a24-a489-33507709811b\") " 
pod="openshift-infra/auto-csr-approver-29539318-d9tmc" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.340111 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sv8q\" (UniqueName: \"kubernetes.io/projected/bf3aad6a-fbd9-4a24-a489-33507709811b-kube-api-access-9sv8q\") pod \"auto-csr-approver-29539318-d9tmc\" (UID: \"bf3aad6a-fbd9-4a24-a489-33507709811b\") " pod="openshift-infra/auto-csr-approver-29539318-d9tmc" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.359047 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sv8q\" (UniqueName: \"kubernetes.io/projected/bf3aad6a-fbd9-4a24-a489-33507709811b-kube-api-access-9sv8q\") pod \"auto-csr-approver-29539318-d9tmc\" (UID: \"bf3aad6a-fbd9-4a24-a489-33507709811b\") " pod="openshift-infra/auto-csr-approver-29539318-d9tmc" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.470289 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539318-d9tmc" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.922886 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539318-d9tmc"] Mar 01 09:58:01 crc kubenswrapper[4792]: I0301 09:58:01.846547 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539318-d9tmc" event={"ID":"bf3aad6a-fbd9-4a24-a489-33507709811b","Type":"ContainerStarted","Data":"9e03b1c59248ba09c1b8c07faed32e4559aa4ccbf524fc20f2afd8ba80f48cc9"} Mar 01 09:58:02 crc kubenswrapper[4792]: I0301 09:58:02.855953 4792 generic.go:334] "Generic (PLEG): container finished" podID="bf3aad6a-fbd9-4a24-a489-33507709811b" containerID="7152ca7878f74975420b6650bf54cd79c2b676e3ea865b2cbf55b92459e46fa6" exitCode=0 Mar 01 09:58:02 crc kubenswrapper[4792]: I0301 09:58:02.855999 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29539318-d9tmc" event={"ID":"bf3aad6a-fbd9-4a24-a489-33507709811b","Type":"ContainerDied","Data":"7152ca7878f74975420b6650bf54cd79c2b676e3ea865b2cbf55b92459e46fa6"} Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.247767 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539318-d9tmc" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.314040 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sv8q\" (UniqueName: \"kubernetes.io/projected/bf3aad6a-fbd9-4a24-a489-33507709811b-kube-api-access-9sv8q\") pod \"bf3aad6a-fbd9-4a24-a489-33507709811b\" (UID: \"bf3aad6a-fbd9-4a24-a489-33507709811b\") " Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.330295 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3aad6a-fbd9-4a24-a489-33507709811b-kube-api-access-9sv8q" (OuterVolumeSpecName: "kube-api-access-9sv8q") pod "bf3aad6a-fbd9-4a24-a489-33507709811b" (UID: "bf3aad6a-fbd9-4a24-a489-33507709811b"). InnerVolumeSpecName "kube-api-access-9sv8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.416881 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sv8q\" (UniqueName: \"kubernetes.io/projected/bf3aad6a-fbd9-4a24-a489-33507709811b-kube-api-access-9sv8q\") on node \"crc\" DevicePath \"\"" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.871236 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539318-d9tmc" event={"ID":"bf3aad6a-fbd9-4a24-a489-33507709811b","Type":"ContainerDied","Data":"9e03b1c59248ba09c1b8c07faed32e4559aa4ccbf524fc20f2afd8ba80f48cc9"} Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.871274 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e03b1c59248ba09c1b8c07faed32e4559aa4ccbf524fc20f2afd8ba80f48cc9" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.871252 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539318-d9tmc" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.944366 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.944418 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.944464 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.945139 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c8bbdbe3553a3ed4617103bbd379245337bfad8f18870faba13af9b3c14caa1"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.945189 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://0c8bbdbe3553a3ed4617103bbd379245337bfad8f18870faba13af9b3c14caa1" gracePeriod=600 Mar 01 09:58:05 crc kubenswrapper[4792]: I0301 09:58:05.308881 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539312-tpqkw"] Mar 01 09:58:05 crc kubenswrapper[4792]: I0301 09:58:05.316717 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539312-tpqkw"] Mar 01 09:58:05 crc kubenswrapper[4792]: I0301 09:58:05.417892 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11abc020-6c8a-4de3-8afc-229196293ab0" path="/var/lib/kubelet/pods/11abc020-6c8a-4de3-8afc-229196293ab0/volumes" Mar 01 09:58:05 crc kubenswrapper[4792]: I0301 09:58:05.880583 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="0c8bbdbe3553a3ed4617103bbd379245337bfad8f18870faba13af9b3c14caa1" exitCode=0 Mar 01 09:58:05 crc kubenswrapper[4792]: I0301 09:58:05.880650 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"0c8bbdbe3553a3ed4617103bbd379245337bfad8f18870faba13af9b3c14caa1"} Mar 01 09:58:05 crc kubenswrapper[4792]: I0301 09:58:05.880968 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"} Mar 01 09:58:05 crc kubenswrapper[4792]: I0301 09:58:05.880995 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:58:31 crc kubenswrapper[4792]: I0301 09:58:31.704816 4792 scope.go:117] "RemoveContainer" containerID="89836f5e70f069d0d23a66a3f24b77f6002210b440a744a89543043e75793243" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.148718 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j"] Mar 01 10:00:00 crc kubenswrapper[4792]: E0301 10:00:00.149790 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3aad6a-fbd9-4a24-a489-33507709811b" containerName="oc" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.149809 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3aad6a-fbd9-4a24-a489-33507709811b" containerName="oc" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.150261 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3aad6a-fbd9-4a24-a489-33507709811b" containerName="oc" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.151330 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.153829 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.154425 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.162015 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539320-bzqcr"] Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.163147 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539320-bzqcr" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.167801 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.167955 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.168085 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.170473 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539320-bzqcr"] Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.177422 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j"] Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.305953 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/8730f988-7504-4e33-a0dc-406e4b21ca50-config-volume\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.306108 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84gpt\" (UniqueName: \"kubernetes.io/projected/8730f988-7504-4e33-a0dc-406e4b21ca50-kube-api-access-84gpt\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.306206 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8730f988-7504-4e33-a0dc-406e4b21ca50-secret-volume\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.306226 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcgkf\" (UniqueName: \"kubernetes.io/projected/a5c085ec-b23e-4ad9-ae76-9775921b667d-kube-api-access-gcgkf\") pod \"auto-csr-approver-29539320-bzqcr\" (UID: \"a5c085ec-b23e-4ad9-ae76-9775921b667d\") " pod="openshift-infra/auto-csr-approver-29539320-bzqcr" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.408351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8730f988-7504-4e33-a0dc-406e4b21ca50-secret-volume\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 
10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.408430 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcgkf\" (UniqueName: \"kubernetes.io/projected/a5c085ec-b23e-4ad9-ae76-9775921b667d-kube-api-access-gcgkf\") pod \"auto-csr-approver-29539320-bzqcr\" (UID: \"a5c085ec-b23e-4ad9-ae76-9775921b667d\") " pod="openshift-infra/auto-csr-approver-29539320-bzqcr" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.408492 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8730f988-7504-4e33-a0dc-406e4b21ca50-config-volume\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.408572 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84gpt\" (UniqueName: \"kubernetes.io/projected/8730f988-7504-4e33-a0dc-406e4b21ca50-kube-api-access-84gpt\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.410049 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8730f988-7504-4e33-a0dc-406e4b21ca50-config-volume\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.414038 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8730f988-7504-4e33-a0dc-406e4b21ca50-secret-volume\") pod \"collect-profiles-29539320-dbp4j\" (UID: 
\"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.428669 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcgkf\" (UniqueName: \"kubernetes.io/projected/a5c085ec-b23e-4ad9-ae76-9775921b667d-kube-api-access-gcgkf\") pod \"auto-csr-approver-29539320-bzqcr\" (UID: \"a5c085ec-b23e-4ad9-ae76-9775921b667d\") " pod="openshift-infra/auto-csr-approver-29539320-bzqcr" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.431700 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84gpt\" (UniqueName: \"kubernetes.io/projected/8730f988-7504-4e33-a0dc-406e4b21ca50-kube-api-access-84gpt\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.522241 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.532706 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539320-bzqcr" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.983197 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j"] Mar 01 10:00:01 crc kubenswrapper[4792]: I0301 10:00:01.057566 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539320-bzqcr"] Mar 01 10:00:01 crc kubenswrapper[4792]: I0301 10:00:01.809621 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539320-bzqcr" event={"ID":"a5c085ec-b23e-4ad9-ae76-9775921b667d","Type":"ContainerStarted","Data":"6950cd079a11f0e71e0a4f9c432b59a859efd91bdf75b8459961647862c328b1"} Mar 01 10:00:01 crc kubenswrapper[4792]: I0301 10:00:01.811954 4792 generic.go:334] "Generic (PLEG): container finished" podID="8730f988-7504-4e33-a0dc-406e4b21ca50" containerID="f2d004ec429ab6ffe82899f882bc50441e9749ff1c56e27e78714fcc65b133e4" exitCode=0 Mar 01 10:00:01 crc kubenswrapper[4792]: I0301 10:00:01.811979 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" event={"ID":"8730f988-7504-4e33-a0dc-406e4b21ca50","Type":"ContainerDied","Data":"f2d004ec429ab6ffe82899f882bc50441e9749ff1c56e27e78714fcc65b133e4"} Mar 01 10:00:01 crc kubenswrapper[4792]: I0301 10:00:01.811992 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" event={"ID":"8730f988-7504-4e33-a0dc-406e4b21ca50","Type":"ContainerStarted","Data":"81e84ed6319131ce2cccc0987ee94a63105b632fd5d0ca856c2abe3c989cb70b"} Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.131476 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.170420 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8730f988-7504-4e33-a0dc-406e4b21ca50-secret-volume\") pod \"8730f988-7504-4e33-a0dc-406e4b21ca50\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.170503 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8730f988-7504-4e33-a0dc-406e4b21ca50-config-volume\") pod \"8730f988-7504-4e33-a0dc-406e4b21ca50\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.170673 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84gpt\" (UniqueName: \"kubernetes.io/projected/8730f988-7504-4e33-a0dc-406e4b21ca50-kube-api-access-84gpt\") pod \"8730f988-7504-4e33-a0dc-406e4b21ca50\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.171478 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8730f988-7504-4e33-a0dc-406e4b21ca50-config-volume" (OuterVolumeSpecName: "config-volume") pod "8730f988-7504-4e33-a0dc-406e4b21ca50" (UID: "8730f988-7504-4e33-a0dc-406e4b21ca50"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.179361 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8730f988-7504-4e33-a0dc-406e4b21ca50-kube-api-access-84gpt" (OuterVolumeSpecName: "kube-api-access-84gpt") pod "8730f988-7504-4e33-a0dc-406e4b21ca50" (UID: "8730f988-7504-4e33-a0dc-406e4b21ca50"). 
InnerVolumeSpecName "kube-api-access-84gpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.181034 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8730f988-7504-4e33-a0dc-406e4b21ca50-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8730f988-7504-4e33-a0dc-406e4b21ca50" (UID: "8730f988-7504-4e33-a0dc-406e4b21ca50"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.272557 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84gpt\" (UniqueName: \"kubernetes.io/projected/8730f988-7504-4e33-a0dc-406e4b21ca50-kube-api-access-84gpt\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.272584 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8730f988-7504-4e33-a0dc-406e4b21ca50-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.272595 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8730f988-7504-4e33-a0dc-406e4b21ca50-config-volume\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.830383 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" event={"ID":"8730f988-7504-4e33-a0dc-406e4b21ca50","Type":"ContainerDied","Data":"81e84ed6319131ce2cccc0987ee94a63105b632fd5d0ca856c2abe3c989cb70b"} Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.830420 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81e84ed6319131ce2cccc0987ee94a63105b632fd5d0ca856c2abe3c989cb70b" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.830436 4792 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:04 crc kubenswrapper[4792]: I0301 10:00:04.214594 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4"] Mar 01 10:00:04 crc kubenswrapper[4792]: I0301 10:00:04.224116 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4"] Mar 01 10:00:05 crc kubenswrapper[4792]: I0301 10:00:05.422682 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8558286c-6cb2-4061-bb84-07803d33b576" path="/var/lib/kubelet/pods/8558286c-6cb2-4061-bb84-07803d33b576/volumes" Mar 01 10:00:18 crc kubenswrapper[4792]: I0301 10:00:18.959447 4792 generic.go:334] "Generic (PLEG): container finished" podID="a5c085ec-b23e-4ad9-ae76-9775921b667d" containerID="4994cd62771f4057b6d1f58071d3828ce7f1350994c490b87bf0e5cb1e97dff9" exitCode=0 Mar 01 10:00:18 crc kubenswrapper[4792]: I0301 10:00:18.959560 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539320-bzqcr" event={"ID":"a5c085ec-b23e-4ad9-ae76-9775921b667d","Type":"ContainerDied","Data":"4994cd62771f4057b6d1f58071d3828ce7f1350994c490b87bf0e5cb1e97dff9"} Mar 01 10:00:20 crc kubenswrapper[4792]: I0301 10:00:20.266830 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539320-bzqcr" Mar 01 10:00:20 crc kubenswrapper[4792]: I0301 10:00:20.374143 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcgkf\" (UniqueName: \"kubernetes.io/projected/a5c085ec-b23e-4ad9-ae76-9775921b667d-kube-api-access-gcgkf\") pod \"a5c085ec-b23e-4ad9-ae76-9775921b667d\" (UID: \"a5c085ec-b23e-4ad9-ae76-9775921b667d\") " Mar 01 10:00:20 crc kubenswrapper[4792]: I0301 10:00:20.391097 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5c085ec-b23e-4ad9-ae76-9775921b667d-kube-api-access-gcgkf" (OuterVolumeSpecName: "kube-api-access-gcgkf") pod "a5c085ec-b23e-4ad9-ae76-9775921b667d" (UID: "a5c085ec-b23e-4ad9-ae76-9775921b667d"). InnerVolumeSpecName "kube-api-access-gcgkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:20 crc kubenswrapper[4792]: I0301 10:00:20.476675 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcgkf\" (UniqueName: \"kubernetes.io/projected/a5c085ec-b23e-4ad9-ae76-9775921b667d-kube-api-access-gcgkf\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:20 crc kubenswrapper[4792]: I0301 10:00:20.979394 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539320-bzqcr" event={"ID":"a5c085ec-b23e-4ad9-ae76-9775921b667d","Type":"ContainerDied","Data":"6950cd079a11f0e71e0a4f9c432b59a859efd91bdf75b8459961647862c328b1"} Mar 01 10:00:20 crc kubenswrapper[4792]: I0301 10:00:20.979460 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6950cd079a11f0e71e0a4f9c432b59a859efd91bdf75b8459961647862c328b1" Mar 01 10:00:20 crc kubenswrapper[4792]: I0301 10:00:20.979533 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539320-bzqcr" Mar 01 10:00:21 crc kubenswrapper[4792]: E0301 10:00:21.127868 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5c085ec_b23e_4ad9_ae76_9775921b667d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5c085ec_b23e_4ad9_ae76_9775921b667d.slice/crio-6950cd079a11f0e71e0a4f9c432b59a859efd91bdf75b8459961647862c328b1\": RecentStats: unable to find data in memory cache]" Mar 01 10:00:21 crc kubenswrapper[4792]: I0301 10:00:21.320824 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539314-9brl7"] Mar 01 10:00:21 crc kubenswrapper[4792]: I0301 10:00:21.330347 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539314-9brl7"] Mar 01 10:00:21 crc kubenswrapper[4792]: I0301 10:00:21.426631 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d633fb4c-b1e3-463f-af0a-2891b7130fc0" path="/var/lib/kubelet/pods/d633fb4c-b1e3-463f-af0a-2891b7130fc0/volumes" Mar 01 10:00:31 crc kubenswrapper[4792]: I0301 10:00:31.797495 4792 scope.go:117] "RemoveContainer" containerID="25d35f30a0bda8efbd3c0227d7f47d3a25118a249c78dad70cafb27c52068acf" Mar 01 10:00:31 crc kubenswrapper[4792]: I0301 10:00:31.822266 4792 scope.go:117] "RemoveContainer" containerID="19ca062443b337d8791859ab02de766e48126cb99d1f720dcaa520cb4be8f904" Mar 01 10:00:34 crc kubenswrapper[4792]: I0301 10:00:34.090633 4792 generic.go:334] "Generic (PLEG): container finished" podID="d7776778-c586-4ab6-8fdf-bfed4168992d" containerID="136627f27d1451f409d8606c1096deb71cb0c8c3aa23573781151b161026a979" exitCode=0 Mar 01 10:00:34 crc kubenswrapper[4792]: I0301 10:00:34.090677 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" event={"ID":"d7776778-c586-4ab6-8fdf-bfed4168992d","Type":"ContainerDied","Data":"136627f27d1451f409d8606c1096deb71cb0c8c3aa23573781151b161026a979"} Mar 01 10:00:34 crc kubenswrapper[4792]: I0301 10:00:34.942634 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:00:34 crc kubenswrapper[4792]: I0301 10:00:34.942963 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.467865 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645419 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkzx5\" (UniqueName: \"kubernetes.io/projected/d7776778-c586-4ab6-8fdf-bfed4168992d-kube-api-access-qkzx5\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645488 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ssh-key-openstack-edpm-ipam\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645513 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645555 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-extra-config-0\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645601 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-inventory\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645685 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-custom-ceph-combined-ca-bundle\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645722 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-0\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645766 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-2\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645832 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-0\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645875 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-1\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645948 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-3\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.646066 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-1\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.646113 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph-nova-0\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.660341 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.660398 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph" (OuterVolumeSpecName: "ceph") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.667721 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7776778-c586-4ab6-8fdf-bfed4168992d-kube-api-access-qkzx5" (OuterVolumeSpecName: "kube-api-access-qkzx5") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "kube-api-access-qkzx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.674416 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.674864 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.686282 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.690773 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.695087 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-inventory" (OuterVolumeSpecName: "inventory") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.695130 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.701877 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "ceph-nova-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.703619 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.705643 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.714065 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.750945 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkzx5\" (UniqueName: \"kubernetes.io/projected/d7776778-c586-4ab6-8fdf-bfed4168992d-kube-api-access-qkzx5\") on node \"crc\" DevicePath \"\""
Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.750972 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.750984 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph\") on node \"crc\" DevicePath \"\""
Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.750996 4792 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751005 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-inventory\") on node \"crc\" DevicePath \"\""
Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751014 4792 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751024 4792 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751032 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751041 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751049 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751057 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751066 4792 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751074 4792 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph-nova-0\") on node \"crc\" DevicePath \"\""
Mar 01 10:00:36 crc kubenswrapper[4792]: I0301 10:00:36.109041 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" event={"ID":"d7776778-c586-4ab6-8fdf-bfed4168992d","Type":"ContainerDied","Data":"f6255f4f054d2fb3983c00b8a4caf1954fd5251d362befc32634842b245118e9"}
Mar 01 10:00:36 crc kubenswrapper[4792]: I0301 10:00:36.109323 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6255f4f054d2fb3983c00b8a4caf1954fd5251d362befc32634842b245118e9"
Mar 01 10:00:36 crc kubenswrapper[4792]: I0301 10:00:36.109104 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.242682 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"]
Mar 01 10:00:50 crc kubenswrapper[4792]: E0301 10:00:50.243560 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8730f988-7504-4e33-a0dc-406e4b21ca50" containerName="collect-profiles"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.243571 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8730f988-7504-4e33-a0dc-406e4b21ca50" containerName="collect-profiles"
Mar 01 10:00:50 crc kubenswrapper[4792]: E0301 10:00:50.243591 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7776778-c586-4ab6-8fdf-bfed4168992d" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.243599 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7776778-c586-4ab6-8fdf-bfed4168992d" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam"
Mar 01 10:00:50 crc kubenswrapper[4792]: E0301 10:00:50.243614 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c085ec-b23e-4ad9-ae76-9775921b667d" containerName="oc"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.243621 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c085ec-b23e-4ad9-ae76-9775921b667d" containerName="oc"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.243805 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c085ec-b23e-4ad9-ae76-9775921b667d" containerName="oc"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.243832 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8730f988-7504-4e33-a0dc-406e4b21ca50" containerName="collect-profiles"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.243848 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7776778-c586-4ab6-8fdf-bfed4168992d" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.244959 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.249478 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.249762 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"]
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.253741 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.258697 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.261597 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.281298 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.288065 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.373843 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.373887 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-config-data\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.373922 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.373943 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.373962 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.373978 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-scripts\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.373994 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhtmd\" (UniqueName: \"kubernetes.io/projected/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-kube-api-access-rhtmd\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374587 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmq67\" (UniqueName: \"kubernetes.io/projected/23d15722-3d0f-44ce-ac55-eba67760f0e9-kube-api-access-qmq67\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374633 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374655 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374679 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-sys\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374712 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374735 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-lib-modules\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374754 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374800 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-nvme\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374837 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374865 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-config-data-custom\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374889 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374929 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374946 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374970 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-run\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374994 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375025 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-dev\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375045 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375066 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375109 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/23d15722-3d0f-44ce-ac55-eba67760f0e9-ceph\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375130 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-run\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375145 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375163 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375235 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375354 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375409 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477245 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/23d15722-3d0f-44ce-ac55-eba67760f0e9-ceph\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477343 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477370 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-run\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477388 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477408 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477427 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477445 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477464 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477478 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477576 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-config-data\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477608 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477632 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477657 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-scripts\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477677 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477702 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhtmd\" (UniqueName: \"kubernetes.io/projected/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-kube-api-access-rhtmd\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477725 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477757 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-run\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477767 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmq67\" (UniqueName: \"kubernetes.io/projected/23d15722-3d0f-44ce-ac55-eba67760f0e9-kube-api-access-qmq67\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477797 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477800 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477839 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477859 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-sys\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477882 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477916 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-lib-modules\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477933 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477962 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-nvme\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478003 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478065 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-config-data-custom\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478085 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478106 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478124 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478398 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-run\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478429 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478446 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-dev\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478463 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478491 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478477 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478463 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478653 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478647 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478715 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-run\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478303 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-lib-modules\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478337 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478737 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.479308 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.479330 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-dev\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.479367 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.479388 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478611 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-nvme\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.479527 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-sys\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.479587 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0"
Mar 01 10:00:50 crc 
kubenswrapper[4792]: I0301 10:00:50.492130 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-scripts\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.493228 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.498814 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-config-data-custom\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.498855 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.499341 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.499415 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-ceph\") 
pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.500456 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/23d15722-3d0f-44ce-ac55-eba67760f0e9-ceph\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.500919 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-config-data\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.502201 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.503054 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.507346 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmq67\" (UniqueName: \"kubernetes.io/projected/23d15722-3d0f-44ce-ac55-eba67760f0e9-kube-api-access-qmq67\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.511690 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhtmd\" (UniqueName: \"kubernetes.io/projected/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-kube-api-access-rhtmd\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.566676 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.578490 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.942881 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-svb22"] Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.945598 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-svb22" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.957595 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-svb22"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.047776 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-fa75-account-create-update-s2kng"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.048996 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.063299 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.078674 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-fa75-account-create-update-s2kng"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.089048 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7736148-bc12-4621-a1d2-efc4a0143b42-operator-scripts\") pod \"manila-db-create-svb22\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " pod="openstack/manila-db-create-svb22" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.089118 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62fr5\" (UniqueName: \"kubernetes.io/projected/c7736148-bc12-4621-a1d2-efc4a0143b42-kube-api-access-62fr5\") pod \"manila-db-create-svb22\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " pod="openstack/manila-db-create-svb22" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.094129 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5887d74897-rnlz9"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.095623 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.102595 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-c29rk" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.102702 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.103030 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.103186 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.151004 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.152576 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.157365 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.157740 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vwjrh" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.157942 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.159115 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.167008 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5887d74897-rnlz9"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.186557 4792 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.190952 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-scripts\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191031 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0441b486-847a-4f32-8df2-a1284f39ee5d-operator-scripts\") pod \"manila-fa75-account-create-update-s2kng\" (UID: \"0441b486-847a-4f32-8df2-a1284f39ee5d\") " pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191076 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbq5q\" (UniqueName: \"kubernetes.io/projected/0441b486-847a-4f32-8df2-a1284f39ee5d-kube-api-access-jbq5q\") pod \"manila-fa75-account-create-update-s2kng\" (UID: \"0441b486-847a-4f32-8df2-a1284f39ee5d\") " pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191101 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7736148-bc12-4621-a1d2-efc4a0143b42-operator-scripts\") pod \"manila-db-create-svb22\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " pod="openstack/manila-db-create-svb22" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191144 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62fr5\" (UniqueName: 
\"kubernetes.io/projected/c7736148-bc12-4621-a1d2-efc4a0143b42-kube-api-access-62fr5\") pod \"manila-db-create-svb22\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " pod="openstack/manila-db-create-svb22" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191172 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-config-data\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191203 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pf4f\" (UniqueName: \"kubernetes.io/projected/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-kube-api-access-2pf4f\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191231 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-logs\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191252 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-horizon-secret-key\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.196157 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c7736148-bc12-4621-a1d2-efc4a0143b42-operator-scripts\") pod \"manila-db-create-svb22\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " pod="openstack/manila-db-create-svb22" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.214240 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.215939 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.222812 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.223057 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.240585 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.248339 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.249231 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62fr5\" (UniqueName: \"kubernetes.io/projected/c7736148-bc12-4621-a1d2-efc4a0143b42-kube-api-access-62fr5\") pod \"manila-db-create-svb22\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " pod="openstack/manila-db-create-svb22" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.279697 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-svb22" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.293943 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-scripts\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294017 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294051 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-config-data\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294078 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0441b486-847a-4f32-8df2-a1284f39ee5d-operator-scripts\") pod \"manila-fa75-account-create-update-s2kng\" (UID: \"0441b486-847a-4f32-8df2-a1284f39ee5d\") " pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294132 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbq5q\" (UniqueName: \"kubernetes.io/projected/0441b486-847a-4f32-8df2-a1284f39ee5d-kube-api-access-jbq5q\") pod \"manila-fa75-account-create-update-s2kng\" (UID: 
\"0441b486-847a-4f32-8df2-a1284f39ee5d\") " pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294168 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294221 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294243 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbxxb\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-kube-api-access-pbxxb\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294285 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-ceph\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294306 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-config-data\") pod 
\"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294355 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pf4f\" (UniqueName: \"kubernetes.io/projected/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-kube-api-access-2pf4f\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294377 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294415 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-logs\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294441 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-horizon-secret-key\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294493 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-scripts\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " 
pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294515 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-logs\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.301028 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0441b486-847a-4f32-8df2-a1284f39ee5d-operator-scripts\") pod \"manila-fa75-account-create-update-s2kng\" (UID: \"0441b486-847a-4f32-8df2-a1284f39ee5d\") " pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: E0301 10:00:51.302285 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-lxswf logs scripts], unattached volumes=[], failed to process volumes=[ceph combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-lxswf logs scripts]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="6c25331e-14fb-47d2-aa34-d84b133255e3" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.302999 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-scripts\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.303432 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-logs\") pod \"horizon-5887d74897-rnlz9\" (UID: 
\"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.307263 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-config-data\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.316106 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-horizon-secret-key\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.332477 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbq5q\" (UniqueName: \"kubernetes.io/projected/0441b486-847a-4f32-8df2-a1284f39ee5d-kube-api-access-jbq5q\") pod \"manila-fa75-account-create-update-s2kng\" (UID: \"0441b486-847a-4f32-8df2-a1284f39ee5d\") " pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.348742 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:51 crc kubenswrapper[4792]: E0301 10:00:51.349445 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run kube-api-access-pbxxb logs public-tls-certs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="316eb94f-166a-4fa2-99b8-8967503eba43" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.365376 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pf4f\" (UniqueName: 
\"kubernetes.io/projected/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-kube-api-access-2pf4f\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.371593 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5996ddfbb9-drpwn"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.373996 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbxxb\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-kube-api-access-pbxxb\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399371 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399431 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399460 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399508 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399542 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-ceph\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399623 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399642 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-logs\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399675 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399747 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399804 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-logs\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399842 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-scripts\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399929 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399965 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-config-data\") pod \"glance-default-external-api-0\" 
(UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.400010 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.400083 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxswf\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-kube-api-access-lxswf\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.400154 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.400215 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.410215 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-httpd-run\") pod \"glance-default-external-api-0\" 
(UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.410280 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.410588 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-logs\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.410866 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.415562 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.444712 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.444991 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.445233 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.445401 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-c29rk" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.453164 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-ceph\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.457184 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.463076 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbxxb\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-kube-api-access-pbxxb\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.463568 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.472794 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-config-data\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.477934 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-scripts\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.477970 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc 
kubenswrapper[4792]: I0301 10:00:51.501825 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7026175e-efaf-497a-aaf1-079f2811ad08-logs\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.501870 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.501973 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcmd2\" (UniqueName: \"kubernetes.io/projected/7026175e-efaf-497a-aaf1-079f2811ad08-kube-api-access-zcmd2\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502003 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-scripts\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502042 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502313 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxswf\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-kube-api-access-lxswf\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502351 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7026175e-efaf-497a-aaf1-079f2811ad08-horizon-secret-key\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502379 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502396 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502413 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502437 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502488 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-logs\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502512 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-config-data\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502529 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.503128 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.503698 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.503766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-logs\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.504846 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.507709 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.521693 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.522421 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.531177 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " 
pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.535419 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.543122 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.543798 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.549686 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxswf\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-kube-api-access-lxswf\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.558171 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.614222 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-config-data\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.614299 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7026175e-efaf-497a-aaf1-079f2811ad08-logs\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.614361 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcmd2\" (UniqueName: \"kubernetes.io/projected/7026175e-efaf-497a-aaf1-079f2811ad08-kube-api-access-zcmd2\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.614391 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-scripts\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.614462 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7026175e-efaf-497a-aaf1-079f2811ad08-horizon-secret-key\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.617747 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-config-data\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.618394 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7026175e-efaf-497a-aaf1-079f2811ad08-logs\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.619139 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5996ddfbb9-drpwn"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.624924 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-scripts\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.641381 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcmd2\" (UniqueName: \"kubernetes.io/projected/7026175e-efaf-497a-aaf1-079f2811ad08-kube-api-access-zcmd2\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.648139 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7026175e-efaf-497a-aaf1-079f2811ad08-horizon-secret-key\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.724401 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.080806 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.179543 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-svb22"] Mar 01 10:00:52 crc kubenswrapper[4792]: W0301 10:00:52.246301 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7736148_bc12_4621_a1d2_efc4a0143b42.slice/crio-0f8fca8a39e93a444c277a56b4d25c74d8a2b8d4dd4ac5a12e9ce102355a7e5c WatchSource:0}: Error finding container 0f8fca8a39e93a444c277a56b4d25c74d8a2b8d4dd4ac5a12e9ce102355a7e5c: Status 404 returned error can't find the container with id 0f8fca8a39e93a444c277a56b4d25c74d8a2b8d4dd4ac5a12e9ce102355a7e5c Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.284787 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"23d15722-3d0f-44ce-ac55-eba67760f0e9","Type":"ContainerStarted","Data":"7d8b2f2c199c4c39a20036ac6f44cb2e87ed37faba8d8865f674bc98eb31c240"} Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.288872 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-svb22" event={"ID":"c7736148-bc12-4621-a1d2-efc4a0143b42","Type":"ContainerStarted","Data":"0f8fca8a39e93a444c277a56b4d25c74d8a2b8d4dd4ac5a12e9ce102355a7e5c"} Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.298161 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.298810 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727","Type":"ContainerStarted","Data":"097dfe01451643cae3dafae5e153076ff26e6d6a3d8864e96d7db2cb52ad425a"} Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.298862 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.313583 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.334733 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.347083 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5887d74897-rnlz9"] Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.370148 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-fa75-account-create-update-s2kng"] Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.391164 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451330 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-internal-tls-certs\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451377 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-combined-ca-bundle\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451426 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-ceph\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451486 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-scripts\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451509 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbxxb\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-kube-api-access-pbxxb\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451529 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-config-data\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451545 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-config-data\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451574 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-httpd-run\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451619 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-httpd-run\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451644 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-ceph\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451678 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-public-tls-certs\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451709 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451731 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451755 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxswf\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-kube-api-access-lxswf\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451789 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-scripts\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451808 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-combined-ca-bundle\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451848 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-logs\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451879 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-logs\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.452617 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: 
"316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.453247 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-logs" (OuterVolumeSpecName: "logs") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.456294 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.475136 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-ceph" (OuterVolumeSpecName: "ceph") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.475226 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-kube-api-access-lxswf" (OuterVolumeSpecName: "kube-api-access-lxswf") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "kube-api-access-lxswf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.475275 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-ceph" (OuterVolumeSpecName: "ceph") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.488204 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.491247 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.492028 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-logs" (OuterVolumeSpecName: "logs") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.509037 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.512051 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.512073 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-config-data" (OuterVolumeSpecName: "config-data") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.512116 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-scripts" (OuterVolumeSpecName: "scripts") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.512132 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-config-data" (OuterVolumeSpecName: "config-data") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.512159 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.515099 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-scripts" (OuterVolumeSpecName: "scripts") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.515259 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-kube-api-access-pbxxb" (OuterVolumeSpecName: "kube-api-access-pbxxb") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "kube-api-access-pbxxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.515414 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554209 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554243 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbxxb\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-kube-api-access-pbxxb\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554256 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554265 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554274 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554283 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554293 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554301 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554322 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554336 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554346 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxswf\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-kube-api-access-lxswf\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554354 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554362 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 
10:00:52.554371 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-logs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554380 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-logs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554388 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554396 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554404 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.560248 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5996ddfbb9-drpwn"] Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.589144 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.628044 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.656061 4792 reconciler_common.go:293] "Volume detached for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.656095 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.317214 4792 generic.go:334] "Generic (PLEG): container finished" podID="c7736148-bc12-4621-a1d2-efc4a0143b42" containerID="bf57ceafd6066a28052f3666ed7d384740c5837c8329274397bd6fa48c44d661" exitCode=0 Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.317664 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-svb22" event={"ID":"c7736148-bc12-4621-a1d2-efc4a0143b42","Type":"ContainerDied","Data":"bf57ceafd6066a28052f3666ed7d384740c5837c8329274397bd6fa48c44d661"} Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.320933 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5887d74897-rnlz9" event={"ID":"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7","Type":"ContainerStarted","Data":"f6db483fc0d11feaf9f5fa34256dfeb85c12b4ea3626fb137a55f680d9cb8957"} Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.322885 4792 generic.go:334] "Generic (PLEG): container finished" podID="0441b486-847a-4f32-8df2-a1284f39ee5d" containerID="31b1b88de471cccce9a774cf2494168f31cb583ae731b5c4c5efee9a815b5533" exitCode=0 Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.322958 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-fa75-account-create-update-s2kng" event={"ID":"0441b486-847a-4f32-8df2-a1284f39ee5d","Type":"ContainerDied","Data":"31b1b88de471cccce9a774cf2494168f31cb583ae731b5c4c5efee9a815b5533"} Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.322986 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-fa75-account-create-update-s2kng" 
event={"ID":"0441b486-847a-4f32-8df2-a1284f39ee5d","Type":"ContainerStarted","Data":"f184175f9208322fa996d81dbf3433551442dc12615f186d77ca1d9635f54883"} Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.324447 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.325877 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5996ddfbb9-drpwn" event={"ID":"7026175e-efaf-497a-aaf1-079f2811ad08","Type":"ContainerStarted","Data":"cdd7f60094a6b70916ce6fb3a1317938714cd27149f604e9df29f382e04185b2"} Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.325994 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.503929 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.504182 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.504198 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.505667 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.514759 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.514792 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.515010 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.515123 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vwjrh" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593285 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593375 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593415 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hljh6\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-kube-api-access-hljh6\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 
10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593457 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593619 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593740 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593800 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593836 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-logs\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 
10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593874 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.616995 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.641184 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.654514 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.663425 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.665609 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.674861 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.676858 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.677054 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.695349 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.695623 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.695726 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.695802 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-logs\") 
pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696186 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696326 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-config-data\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696523 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696664 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-logs\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696753 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696851 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696949 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hljh6\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-kube-api-access-hljh6\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697251 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697378 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-scripts\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697486 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-866lt\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-kube-api-access-866lt\") pod \"glance-default-external-api-0\" (UID: 
\"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697555 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697214 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697627 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697753 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697837 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-ceph\") pod \"glance-default-external-api-0\" (UID: 
\"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696894 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-logs\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.724368 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.749868 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hljh6\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-kube-api-access-hljh6\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.750108 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc 
kubenswrapper[4792]: I0301 10:00:53.750590 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.749688 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.774215 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.780684 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.807495 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-scripts\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.807697 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-866lt\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-kube-api-access-866lt\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.810345 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.810426 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.810484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-ceph\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.810638 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.810719 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-config-data\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.810776 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.810805 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-logs\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.811600 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-logs\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.813136 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.813333 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: 
\"52d866c4-a856-4930-9501-2be56e07d3ce\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.827127 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-config-data\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.836821 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.841701 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-ceph\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.844819 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-866lt\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-kube-api-access-866lt\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.876033 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5887d74897-rnlz9"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.876927 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.880210 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-scripts\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.943406 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-689c76c966-7mbkl"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.944975 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.949815 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.952664 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.966237 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.990373 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.010743 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.025082 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-tls-certs\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.025131 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-scripts\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.025169 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-secret-key\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.025191 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-config-data\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.025231 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-combined-ca-bundle\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.025270 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv6tz\" (UniqueName: \"kubernetes.io/projected/67e060ef-1cc6-4b39-8622-bbcc183bdda0-kube-api-access-rv6tz\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.025341 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67e060ef-1cc6-4b39-8622-bbcc183bdda0-logs\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.039476 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-689c76c966-7mbkl"] Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.066144 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5996ddfbb9-drpwn"] Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.080361 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.098032 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79f8cb6d9d-xg7h5"] Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.107803 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.115521 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79f8cb6d9d-xg7h5"] Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130074 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-horizon-tls-certs\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130143 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-tls-certs\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130173 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-horizon-secret-key\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130194 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-scripts\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130220 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbj7z\" (UniqueName: 
\"kubernetes.io/projected/d7f79f77-ac1b-445e-8e28-85c8964f5461-kube-api-access-dbj7z\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130240 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7f79f77-ac1b-445e-8e28-85c8964f5461-config-data\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130258 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-secret-key\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130279 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-config-data\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130317 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7f79f77-ac1b-445e-8e28-85c8964f5461-scripts\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130338 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-combined-ca-bundle\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130367 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7f79f77-ac1b-445e-8e28-85c8964f5461-logs\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130384 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv6tz\" (UniqueName: \"kubernetes.io/projected/67e060ef-1cc6-4b39-8622-bbcc183bdda0-kube-api-access-rv6tz\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130447 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67e060ef-1cc6-4b39-8622-bbcc183bdda0-logs\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130477 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-combined-ca-bundle\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.132993 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-config-data\") pod 
\"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.133284 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67e060ef-1cc6-4b39-8622-bbcc183bdda0-logs\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.133571 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-scripts\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.141832 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-combined-ca-bundle\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.146004 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-secret-key\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.151653 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-tls-certs\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc 
kubenswrapper[4792]: I0301 10:00:54.168596 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv6tz\" (UniqueName: \"kubernetes.io/projected/67e060ef-1cc6-4b39-8622-bbcc183bdda0-kube-api-access-rv6tz\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.240364 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-combined-ca-bundle\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.240718 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-horizon-tls-certs\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.240853 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-horizon-secret-key\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.240920 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbj7z\" (UniqueName: \"kubernetes.io/projected/d7f79f77-ac1b-445e-8e28-85c8964f5461-kube-api-access-dbj7z\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.240949 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7f79f77-ac1b-445e-8e28-85c8964f5461-config-data\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.241011 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7f79f77-ac1b-445e-8e28-85c8964f5461-scripts\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.241078 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7f79f77-ac1b-445e-8e28-85c8964f5461-logs\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.243454 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7f79f77-ac1b-445e-8e28-85c8964f5461-scripts\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.243623 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7f79f77-ac1b-445e-8e28-85c8964f5461-logs\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.243727 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7f79f77-ac1b-445e-8e28-85c8964f5461-config-data\") pod 
\"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.246386 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-horizon-tls-certs\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.264494 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-combined-ca-bundle\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.269789 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-horizon-secret-key\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.280629 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbj7z\" (UniqueName: \"kubernetes.io/projected/d7f79f77-ac1b-445e-8e28-85c8964f5461-kube-api-access-dbj7z\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.320149 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.379714 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727","Type":"ContainerStarted","Data":"0fe3e9d9e5029b744e1fad7c8697fe586697f00dbac1e16da5e28bbf1f726170"} Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.379763 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727","Type":"ContainerStarted","Data":"e4d4459208928b8cecfd99dceb97323b4f6b790a52fc4ce6cf99fd4332f79534"} Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.460764 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.787862 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.172949351 podStartE2EDuration="4.78784535s" podCreationTimestamp="2026-03-01 10:00:50 +0000 UTC" firstStartedPulling="2026-03-01 10:00:51.546478753 +0000 UTC m=+3180.788357950" lastFinishedPulling="2026-03-01 10:00:53.161374752 +0000 UTC m=+3182.403253949" observedRunningTime="2026-03-01 10:00:54.407357807 +0000 UTC m=+3183.649237004" watchObservedRunningTime="2026-03-01 10:00:54.78784535 +0000 UTC m=+3184.029724547" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.791521 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:54 crc kubenswrapper[4792]: W0301 10:00:54.848510 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6327707a_a9c5_4ba1_9c54_21cbb2e47222.slice/crio-4dd7bd8f9c142eedf5828dbb684b632a29d244482b177fd704a0a7b1d31e3c86 WatchSource:0}: Error finding 
container 4dd7bd8f9c142eedf5828dbb684b632a29d244482b177fd704a0a7b1d31e3c86: Status 404 returned error can't find the container with id 4dd7bd8f9c142eedf5828dbb684b632a29d244482b177fd704a0a7b1d31e3c86 Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.878156 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.076550 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-689c76c966-7mbkl"] Mar 01 10:00:55 crc kubenswrapper[4792]: W0301 10:00:55.116893 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67e060ef_1cc6_4b39_8622_bbcc183bdda0.slice/crio-ca973dc9b124eaa6124a0d372262dadb60856c6d51900f904c6c257a734ebfb0 WatchSource:0}: Error finding container ca973dc9b124eaa6124a0d372262dadb60856c6d51900f904c6c257a734ebfb0: Status 404 returned error can't find the container with id ca973dc9b124eaa6124a0d372262dadb60856c6d51900f904c6c257a734ebfb0 Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.207573 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-svb22" Mar 01 10:00:55 crc kubenswrapper[4792]: W0301 10:00:55.231446 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7f79f77_ac1b_445e_8e28_85c8964f5461.slice/crio-ac108770bf85ca6b5834e2dc59696facad649f58525fe3c2fe62f7a6209a3707 WatchSource:0}: Error finding container ac108770bf85ca6b5834e2dc59696facad649f58525fe3c2fe62f7a6209a3707: Status 404 returned error can't find the container with id ac108770bf85ca6b5834e2dc59696facad649f58525fe3c2fe62f7a6209a3707 Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.238819 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79f8cb6d9d-xg7h5"] Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.257396 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.291712 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62fr5\" (UniqueName: \"kubernetes.io/projected/c7736148-bc12-4621-a1d2-efc4a0143b42-kube-api-access-62fr5\") pod \"c7736148-bc12-4621-a1d2-efc4a0143b42\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.291801 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7736148-bc12-4621-a1d2-efc4a0143b42-operator-scripts\") pod \"c7736148-bc12-4621-a1d2-efc4a0143b42\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.301523 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7736148-bc12-4621-a1d2-efc4a0143b42-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7736148-bc12-4621-a1d2-efc4a0143b42" 
(UID: "c7736148-bc12-4621-a1d2-efc4a0143b42"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.317243 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7736148-bc12-4621-a1d2-efc4a0143b42-kube-api-access-62fr5" (OuterVolumeSpecName: "kube-api-access-62fr5") pod "c7736148-bc12-4621-a1d2-efc4a0143b42" (UID: "c7736148-bc12-4621-a1d2-efc4a0143b42"). InnerVolumeSpecName "kube-api-access-62fr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.401532 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbq5q\" (UniqueName: \"kubernetes.io/projected/0441b486-847a-4f32-8df2-a1284f39ee5d-kube-api-access-jbq5q\") pod \"0441b486-847a-4f32-8df2-a1284f39ee5d\" (UID: \"0441b486-847a-4f32-8df2-a1284f39ee5d\") " Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.401580 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0441b486-847a-4f32-8df2-a1284f39ee5d-operator-scripts\") pod \"0441b486-847a-4f32-8df2-a1284f39ee5d\" (UID: \"0441b486-847a-4f32-8df2-a1284f39ee5d\") " Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.402185 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62fr5\" (UniqueName: \"kubernetes.io/projected/c7736148-bc12-4621-a1d2-efc4a0143b42-kube-api-access-62fr5\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.402200 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7736148-bc12-4621-a1d2-efc4a0143b42-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.403560 4792 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0441b486-847a-4f32-8df2-a1284f39ee5d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0441b486-847a-4f32-8df2-a1284f39ee5d" (UID: "0441b486-847a-4f32-8df2-a1284f39ee5d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.409525 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0441b486-847a-4f32-8df2-a1284f39ee5d-kube-api-access-jbq5q" (OuterVolumeSpecName: "kube-api-access-jbq5q") pod "0441b486-847a-4f32-8df2-a1284f39ee5d" (UID: "0441b486-847a-4f32-8df2-a1284f39ee5d"). InnerVolumeSpecName "kube-api-access-jbq5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.444282 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="316eb94f-166a-4fa2-99b8-8967503eba43" path="/var/lib/kubelet/pods/316eb94f-166a-4fa2-99b8-8967503eba43/volumes" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.445143 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c25331e-14fb-47d2-aa34-d84b133255e3" path="/var/lib/kubelet/pods/6c25331e-14fb-47d2-aa34-d84b133255e3/volumes" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.445583 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.449462 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6327707a-a9c5-4ba1-9c54-21cbb2e47222","Type":"ContainerStarted","Data":"4dd7bd8f9c142eedf5828dbb684b632a29d244482b177fd704a0a7b1d31e3c86"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.449495 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-fa75-account-create-update-s2kng" event={"ID":"0441b486-847a-4f32-8df2-a1284f39ee5d","Type":"ContainerDied","Data":"f184175f9208322fa996d81dbf3433551442dc12615f186d77ca1d9635f54883"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.449509 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f184175f9208322fa996d81dbf3433551442dc12615f186d77ca1d9635f54883" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.479077 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689c76c966-7mbkl" event={"ID":"67e060ef-1cc6-4b39-8622-bbcc183bdda0","Type":"ContainerStarted","Data":"ca973dc9b124eaa6124a0d372262dadb60856c6d51900f904c6c257a734ebfb0"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.504982 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbq5q\" (UniqueName: \"kubernetes.io/projected/0441b486-847a-4f32-8df2-a1284f39ee5d-kube-api-access-jbq5q\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.505007 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0441b486-847a-4f32-8df2-a1284f39ee5d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.513656 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79f8cb6d9d-xg7h5" 
event={"ID":"d7f79f77-ac1b-445e-8e28-85c8964f5461","Type":"ContainerStarted","Data":"ac108770bf85ca6b5834e2dc59696facad649f58525fe3c2fe62f7a6209a3707"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.528162 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52d866c4-a856-4930-9501-2be56e07d3ce","Type":"ContainerStarted","Data":"73a05d008372db4bf11e47f503cd30ac75f01c346a6a7e72a1f353158728cf02"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.539236 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"23d15722-3d0f-44ce-ac55-eba67760f0e9","Type":"ContainerStarted","Data":"7964dba61557d3f331243b26dd157e553388e41ae7845157ebedbb8b686a988f"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.539283 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"23d15722-3d0f-44ce-ac55-eba67760f0e9","Type":"ContainerStarted","Data":"df777540105399ac6291b9dde10bc4d5ed076c6cb9b38e40403b88e9a6e303cc"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.554276 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-svb22" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.554316 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-svb22" event={"ID":"c7736148-bc12-4621-a1d2-efc4a0143b42","Type":"ContainerDied","Data":"0f8fca8a39e93a444c277a56b4d25c74d8a2b8d4dd4ac5a12e9ce102355a7e5c"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.554340 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f8fca8a39e93a444c277a56b4d25c74d8a2b8d4dd4ac5a12e9ce102355a7e5c" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.567108 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.567418 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.775778644 podStartE2EDuration="5.567409919s" podCreationTimestamp="2026-03-01 10:00:50 +0000 UTC" firstStartedPulling="2026-03-01 10:00:52.085099547 +0000 UTC m=+3181.326978734" lastFinishedPulling="2026-03-01 10:00:53.876730812 +0000 UTC m=+3183.118610009" observedRunningTime="2026-03-01 10:00:55.565505722 +0000 UTC m=+3184.807384919" watchObservedRunningTime="2026-03-01 10:00:55.567409919 +0000 UTC m=+3184.809289116" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.579746 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 01 10:00:56 crc kubenswrapper[4792]: I0301 10:00:56.639357 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6327707a-a9c5-4ba1-9c54-21cbb2e47222","Type":"ContainerStarted","Data":"da274f80edf717de1da1c26bd84e489a22c77768488cdf1c9349a7c0e2449fda"} Mar 01 10:00:56 crc kubenswrapper[4792]: I0301 10:00:56.665382 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"52d866c4-a856-4930-9501-2be56e07d3ce","Type":"ContainerStarted","Data":"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5"} Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.675837 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52d866c4-a856-4930-9501-2be56e07d3ce","Type":"ContainerStarted","Data":"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb"} Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.675848 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-log" containerID="cri-o://deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5" gracePeriod=30 Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.675942 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-httpd" containerID="cri-o://e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb" gracePeriod=30 Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.681976 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6327707a-a9c5-4ba1-9c54-21cbb2e47222","Type":"ContainerStarted","Data":"14967d4571fbfd80a32479d19d2b933beb19ce543caa25b7ae07a663d46e8838"} Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.682100 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-log" containerID="cri-o://da274f80edf717de1da1c26bd84e489a22c77768488cdf1c9349a7c0e2449fda" gracePeriod=30 Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.682207 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-httpd" containerID="cri-o://14967d4571fbfd80a32479d19d2b933beb19ce543caa25b7ae07a663d46e8838" gracePeriod=30 Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.707236 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.707216462 podStartE2EDuration="4.707216462s" podCreationTimestamp="2026-03-01 10:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:00:57.697550391 +0000 UTC m=+3186.939429588" watchObservedRunningTime="2026-03-01 10:00:57.707216462 +0000 UTC m=+3186.949095659" Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.739113 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.739096807 podStartE2EDuration="4.739096807s" podCreationTimestamp="2026-03-01 10:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:00:57.738785399 +0000 UTC m=+3186.980664596" watchObservedRunningTime="2026-03-01 10:00:57.739096807 +0000 UTC m=+3186.980976004" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.501037 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.614614 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-logs\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.614683 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.614754 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-httpd-run\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.614864 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-866lt\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-kube-api-access-866lt\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.614898 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-public-tls-certs\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.615009 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-config-data\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.615038 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-ceph\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.615134 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-combined-ca-bundle\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.615490 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-scripts\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.618238 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.618823 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-logs" (OuterVolumeSpecName: "logs") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.627651 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-kube-api-access-866lt" (OuterVolumeSpecName: "kube-api-access-866lt") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "kube-api-access-866lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.628065 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-scripts" (OuterVolumeSpecName: "scripts") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.629021 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.642988 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-ceph" (OuterVolumeSpecName: "ceph") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.660608 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.720881 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.721203 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.721214 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-logs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.721236 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.721245 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.721254 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-866lt\" (UniqueName: 
\"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-kube-api-access-866lt\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.721264 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.735148 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-config-data" (OuterVolumeSpecName: "config-data") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.736083 4792 generic.go:334] "Generic (PLEG): container finished" podID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerID="14967d4571fbfd80a32479d19d2b933beb19ce543caa25b7ae07a663d46e8838" exitCode=143 Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.736394 4792 generic.go:334] "Generic (PLEG): container finished" podID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerID="da274f80edf717de1da1c26bd84e489a22c77768488cdf1c9349a7c0e2449fda" exitCode=143 Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.736658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6327707a-a9c5-4ba1-9c54-21cbb2e47222","Type":"ContainerDied","Data":"14967d4571fbfd80a32479d19d2b933beb19ce543caa25b7ae07a663d46e8838"} Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.737249 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6327707a-a9c5-4ba1-9c54-21cbb2e47222","Type":"ContainerDied","Data":"da274f80edf717de1da1c26bd84e489a22c77768488cdf1c9349a7c0e2449fda"} Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 
10:00:58.749238 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.760970 4792 generic.go:334] "Generic (PLEG): container finished" podID="52d866c4-a856-4930-9501-2be56e07d3ce" containerID="e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb" exitCode=143 Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.761005 4792 generic.go:334] "Generic (PLEG): container finished" podID="52d866c4-a856-4930-9501-2be56e07d3ce" containerID="deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5" exitCode=143 Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.761029 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52d866c4-a856-4930-9501-2be56e07d3ce","Type":"ContainerDied","Data":"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb"} Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.761056 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52d866c4-a856-4930-9501-2be56e07d3ce","Type":"ContainerDied","Data":"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5"} Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.761065 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52d866c4-a856-4930-9501-2be56e07d3ce","Type":"ContainerDied","Data":"73a05d008372db4bf11e47f503cd30ac75f01c346a6a7e72a1f353158728cf02"} Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.761089 4792 scope.go:117] "RemoveContainer" containerID="e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.761475 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.819036 4792 scope.go:117] "RemoveContainer" containerID="deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.829522 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.829540 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.830297 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.934808 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.957259 4792 scope.go:117] "RemoveContainer" containerID="e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb" Mar 01 10:00:58 crc kubenswrapper[4792]: E0301 10:00:58.962268 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb\": container with ID starting with e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb not found: ID does not exist" containerID="e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.962312 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb"} err="failed to get container status \"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb\": rpc error: code = NotFound desc = could not find container \"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb\": container with ID starting with e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb not found: ID does not exist" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.962338 4792 scope.go:117] "RemoveContainer" containerID="deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5" Mar 01 10:00:58 crc kubenswrapper[4792]: E0301 10:00:58.962875 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5\": 
container with ID starting with deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5 not found: ID does not exist" containerID="deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.962929 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5"} err="failed to get container status \"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5\": rpc error: code = NotFound desc = could not find container \"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5\": container with ID starting with deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5 not found: ID does not exist" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.962944 4792 scope.go:117] "RemoveContainer" containerID="e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.963983 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb"} err="failed to get container status \"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb\": rpc error: code = NotFound desc = could not find container \"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb\": container with ID starting with e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb not found: ID does not exist" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.964029 4792 scope.go:117] "RemoveContainer" containerID="deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.964456 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5"} err="failed to get 
container status \"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5\": rpc error: code = NotFound desc = could not find container \"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5\": container with ID starting with deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5 not found: ID does not exist" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.110300 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.133824 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.143994 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.162097 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: E0301 10:00:59.162685 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0441b486-847a-4f32-8df2-a1284f39ee5d" containerName="mariadb-account-create-update" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.162709 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0441b486-847a-4f32-8df2-a1284f39ee5d" containerName="mariadb-account-create-update" Mar 01 10:00:59 crc kubenswrapper[4792]: E0301 10:00:59.162730 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7736148-bc12-4621-a1d2-efc4a0143b42" containerName="mariadb-database-create" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.162737 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7736148-bc12-4621-a1d2-efc4a0143b42" containerName="mariadb-database-create" Mar 01 10:00:59 crc kubenswrapper[4792]: E0301 10:00:59.162756 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-log" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.162764 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-log" Mar 01 10:00:59 crc kubenswrapper[4792]: E0301 10:00:59.162788 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-log" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.162796 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-log" Mar 01 10:00:59 crc kubenswrapper[4792]: E0301 10:00:59.162810 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-httpd" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.162818 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-httpd" Mar 01 10:00:59 crc kubenswrapper[4792]: E0301 10:00:59.162831 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-httpd" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.162839 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-httpd" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.163072 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-log" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.163121 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7736148-bc12-4621-a1d2-efc4a0143b42" containerName="mariadb-database-create" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.163137 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-httpd" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.163152 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-log" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.163162 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-httpd" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.163176 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0441b486-847a-4f32-8df2-a1284f39ee5d" containerName="mariadb-account-create-update" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.164244 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.168055 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.168651 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.189217 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248507 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-ceph\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248612 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-httpd-run\") pod 
\"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248648 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248737 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hljh6\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-kube-api-access-hljh6\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248780 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-scripts\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248810 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-internal-tls-certs\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248959 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-logs\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248990 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-config-data\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249012 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-combined-ca-bundle\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249350 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-scripts\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249392 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-config-data\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249443 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnt28\" (UniqueName: \"kubernetes.io/projected/52b189da-3327-40c1-bf22-a842b0980593-kube-api-access-jnt28\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249477 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/52b189da-3327-40c1-bf22-a842b0980593-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249500 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52b189da-3327-40c1-bf22-a842b0980593-ceph\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249546 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249585 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249642 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249725 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/52b189da-3327-40c1-bf22-a842b0980593-logs\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.252661 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.253619 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-logs" (OuterVolumeSpecName: "logs") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.256692 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-kube-api-access-hljh6" (OuterVolumeSpecName: "kube-api-access-hljh6") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "kube-api-access-hljh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.258290 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.262384 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-ceph" (OuterVolumeSpecName: "ceph") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.274391 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-scripts" (OuterVolumeSpecName: "scripts") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.285473 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.342767 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-config-data" (OuterVolumeSpecName: "config-data") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.348178 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.350940 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52b189da-3327-40c1-bf22-a842b0980593-logs\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351024 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-scripts\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351049 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-config-data\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351088 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnt28\" (UniqueName: \"kubernetes.io/projected/52b189da-3327-40c1-bf22-a842b0980593-kube-api-access-jnt28\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " 
pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351109 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52b189da-3327-40c1-bf22-a842b0980593-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351126 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52b189da-3327-40c1-bf22-a842b0980593-ceph\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351193 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351218 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351254 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 
10:00:59.351314 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351426 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-logs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351438 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351447 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351455 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351463 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351481 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351492 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hljh6\" (UniqueName: 
\"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-kube-api-access-hljh6\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351500 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351527 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52b189da-3327-40c1-bf22-a842b0980593-logs\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.352640 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.353856 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52b189da-3327-40c1-bf22-a842b0980593-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.354627 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-scripts\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.356040 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-config-data\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.358070 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52b189da-3327-40c1-bf22-a842b0980593-ceph\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.366004 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.370892 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnt28\" (UniqueName: \"kubernetes.io/projected/52b189da-3327-40c1-bf22-a842b0980593-kube-api-access-jnt28\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.371063 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.394343 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.395431 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.425337 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52d866c4-a856-4930-9501-2be56e07d3ce" path="/var/lib/kubelet/pods/52d866c4-a856-4930-9501-2be56e07d3ce/volumes" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.452825 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.489560 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.780155 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6327707a-a9c5-4ba1-9c54-21cbb2e47222","Type":"ContainerDied","Data":"4dd7bd8f9c142eedf5828dbb684b632a29d244482b177fd704a0a7b1d31e3c86"} Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.780489 4792 scope.go:117] "RemoveContainer" containerID="14967d4571fbfd80a32479d19d2b933beb19ce543caa25b7ae07a663d46e8838" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.780192 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.815016 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.829460 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.852043 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.853638 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.856684 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.857045 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.897843 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.966998 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967485 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6zm2\" (UniqueName: \"kubernetes.io/projected/9d055103-6c35-481f-820a-7aa363543404-kube-api-access-b6zm2\") pod 
\"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967561 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967626 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d055103-6c35-481f-820a-7aa363543404-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967645 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967794 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d055103-6c35-481f-820a-7aa363543404-logs\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967855 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967901 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967955 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9d055103-6c35-481f-820a-7aa363543404-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.042572 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.070625 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d055103-6c35-481f-820a-7aa363543404-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.070671 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.070753 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d055103-6c35-481f-820a-7aa363543404-logs\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.070825 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.070907 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.071017 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9d055103-6c35-481f-820a-7aa363543404-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.071175 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.072201 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6zm2\" (UniqueName: 
\"kubernetes.io/projected/9d055103-6c35-481f-820a-7aa363543404-kube-api-access-b6zm2\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.072278 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.072323 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d055103-6c35-481f-820a-7aa363543404-logs\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.071356 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.071200 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d055103-6c35-481f-820a-7aa363543404-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.077125 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.079400 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.081360 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.088758 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6zm2\" (UniqueName: \"kubernetes.io/projected/9d055103-6c35-481f-820a-7aa363543404-kube-api-access-b6zm2\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.088841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9d055103-6c35-481f-820a-7aa363543404-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.097371 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " 
pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.133609 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.144807 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29539321-sclgm"] Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.146353 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.157613 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29539321-sclgm"] Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.202830 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.278277 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72wdc\" (UniqueName: \"kubernetes.io/projected/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-kube-api-access-72wdc\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.278356 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-combined-ca-bundle\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.278385 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-config-data\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.278420 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-fernet-keys\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.379716 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72wdc\" (UniqueName: \"kubernetes.io/projected/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-kube-api-access-72wdc\") pod 
\"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.379803 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-combined-ca-bundle\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.379822 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-config-data\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.379857 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-fernet-keys\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.383781 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-combined-ca-bundle\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.416292 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-fernet-keys\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " 
pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.416352 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72wdc\" (UniqueName: \"kubernetes.io/projected/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-kube-api-access-72wdc\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.419142 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-config-data\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.539241 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.769960 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.873563 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.432620 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" path="/var/lib/kubelet/pods/6327707a-a9c5-4ba1-9c54-21cbb2e47222/volumes" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.491566 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-p2dtn"] Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.493074 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.504097 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.504789 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-x2pjp" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.527867 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-p2dtn"] Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.617611 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-job-config-data\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.617704 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-combined-ca-bundle\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.617764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-config-data\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.617809 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8pgg\" (UniqueName: 
\"kubernetes.io/projected/01bf5dae-6217-4644-9c9b-65d3886a4dc1-kube-api-access-n8pgg\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.719553 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-combined-ca-bundle\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.719636 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-config-data\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.719678 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8pgg\" (UniqueName: \"kubernetes.io/projected/01bf5dae-6217-4644-9c9b-65d3886a4dc1-kube-api-access-n8pgg\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.719768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-job-config-data\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.730478 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-job-config-data\") pod \"manila-db-sync-p2dtn\" (UID: 
\"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.739365 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8pgg\" (UniqueName: \"kubernetes.io/projected/01bf5dae-6217-4644-9c9b-65d3886a4dc1-kube-api-access-n8pgg\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.740112 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-combined-ca-bundle\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.745047 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-config-data\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.822029 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:04 crc kubenswrapper[4792]: I0301 10:01:04.943281 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:01:04 crc kubenswrapper[4792]: I0301 10:01:04.943664 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:01:05 crc kubenswrapper[4792]: W0301 10:01:05.666576 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52b189da_3327_40c1_bf22_a842b0980593.slice/crio-3a9834ee2fc0174c61ddd45d2bc8e20177af7786aa6702cb7d4060cedc520537 WatchSource:0}: Error finding container 3a9834ee2fc0174c61ddd45d2bc8e20177af7786aa6702cb7d4060cedc520537: Status 404 returned error can't find the container with id 3a9834ee2fc0174c61ddd45d2bc8e20177af7786aa6702cb7d4060cedc520537 Mar 01 10:01:05 crc kubenswrapper[4792]: I0301 10:01:05.708767 4792 scope.go:117] "RemoveContainer" containerID="da274f80edf717de1da1c26bd84e489a22c77768488cdf1c9349a7c0e2449fda" Mar 01 10:01:05 crc kubenswrapper[4792]: I0301 10:01:05.884341 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52b189da-3327-40c1-bf22-a842b0980593","Type":"ContainerStarted","Data":"3a9834ee2fc0174c61ddd45d2bc8e20177af7786aa6702cb7d4060cedc520537"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.318942 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.437395 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29539321-sclgm"] Mar 01 10:01:06 crc kubenswrapper[4792]: W0301 10:01:06.515216 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ec04609_b280_4df0_a0c5_2e4c7208c1c6.slice/crio-9eb42b80e67cd6ee111c52690140db4093810e4ed24e856419a0421b142b5cae WatchSource:0}: Error finding container 9eb42b80e67cd6ee111c52690140db4093810e4ed24e856419a0421b142b5cae: Status 404 returned error can't find the container with id 9eb42b80e67cd6ee111c52690140db4093810e4ed24e856419a0421b142b5cae Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.532078 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-p2dtn"] Mar 01 10:01:06 crc kubenswrapper[4792]: W0301 10:01:06.570935 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01bf5dae_6217_4644_9c9b_65d3886a4dc1.slice/crio-602a6322b970b1abc9872680d2570a6e260b8dc7e4033e670c6368378000d632 WatchSource:0}: Error finding container 602a6322b970b1abc9872680d2570a6e260b8dc7e4033e670c6368378000d632: Status 404 returned error can't find the container with id 602a6322b970b1abc9872680d2570a6e260b8dc7e4033e670c6368378000d632 Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.589688 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.901889 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5996ddfbb9-drpwn" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon-log" containerID="cri-o://b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa" gracePeriod=30 Mar 01 10:01:06 crc 
kubenswrapper[4792]: I0301 10:01:06.901883 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5996ddfbb9-drpwn" event={"ID":"7026175e-efaf-497a-aaf1-079f2811ad08","Type":"ContainerStarted","Data":"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.902398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5996ddfbb9-drpwn" event={"ID":"7026175e-efaf-497a-aaf1-079f2811ad08","Type":"ContainerStarted","Data":"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.902521 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5996ddfbb9-drpwn" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon" containerID="cri-o://54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac" gracePeriod=30 Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.907119 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79f8cb6d9d-xg7h5" event={"ID":"d7f79f77-ac1b-445e-8e28-85c8964f5461","Type":"ContainerStarted","Data":"e344c821e4bd4422c2a14dbebedafc5386ba96ddbd63c7249bebc2482072eb00"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.909611 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d055103-6c35-481f-820a-7aa363543404","Type":"ContainerStarted","Data":"10cf97f98099f7739343179f7e9a4039e3126ba719c407bf92cd7c8fe310ddc5"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.911609 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5887d74897-rnlz9" event={"ID":"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7","Type":"ContainerStarted","Data":"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.918182 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/manila-db-sync-p2dtn" event={"ID":"01bf5dae-6217-4644-9c9b-65d3886a4dc1","Type":"ContainerStarted","Data":"602a6322b970b1abc9872680d2570a6e260b8dc7e4033e670c6368378000d632"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.926053 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689c76c966-7mbkl" event={"ID":"67e060ef-1cc6-4b39-8622-bbcc183bdda0","Type":"ContainerStarted","Data":"809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.926132 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689c76c966-7mbkl" event={"ID":"67e060ef-1cc6-4b39-8622-bbcc183bdda0","Type":"ContainerStarted","Data":"7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.933870 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5996ddfbb9-drpwn" podStartSLOduration=2.582071496 podStartE2EDuration="15.933847055s" podCreationTimestamp="2026-03-01 10:00:51 +0000 UTC" firstStartedPulling="2026-03-01 10:00:52.598505993 +0000 UTC m=+3181.840385190" lastFinishedPulling="2026-03-01 10:01:05.950281552 +0000 UTC m=+3195.192160749" observedRunningTime="2026-03-01 10:01:06.923572589 +0000 UTC m=+3196.165451786" watchObservedRunningTime="2026-03-01 10:01:06.933847055 +0000 UTC m=+3196.175726252" Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.935985 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29539321-sclgm" event={"ID":"7ec04609-b280-4df0-a0c5-2e4c7208c1c6","Type":"ContainerStarted","Data":"9eb42b80e67cd6ee111c52690140db4093810e4ed24e856419a0421b142b5cae"} Mar 01 10:01:07 crc kubenswrapper[4792]: I0301 10:01:07.961756 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29539321-sclgm" 
event={"ID":"7ec04609-b280-4df0-a0c5-2e4c7208c1c6","Type":"ContainerStarted","Data":"45c567ce10495b75abefb1478976cbc609499a5f59a4a0f633c7587b44678000"} Mar 01 10:01:07 crc kubenswrapper[4792]: I0301 10:01:07.965613 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79f8cb6d9d-xg7h5" event={"ID":"d7f79f77-ac1b-445e-8e28-85c8964f5461","Type":"ContainerStarted","Data":"0cf91c9a522aacb2134a6cd7d91d0553f3efca2cb569fa0827511bb944ef8438"} Mar 01 10:01:07 crc kubenswrapper[4792]: I0301 10:01:07.975523 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d055103-6c35-481f-820a-7aa363543404","Type":"ContainerStarted","Data":"7d2cc88f8a6f37e266989b6de9d788f976fe55173800f18c1f69b4a860e6c34f"} Mar 01 10:01:07 crc kubenswrapper[4792]: I0301 10:01:07.985572 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-689c76c966-7mbkl" podStartSLOduration=4.156399664 podStartE2EDuration="14.985557338s" podCreationTimestamp="2026-03-01 10:00:53 +0000 UTC" firstStartedPulling="2026-03-01 10:00:55.129533447 +0000 UTC m=+3184.371412644" lastFinishedPulling="2026-03-01 10:01:05.958691131 +0000 UTC m=+3195.200570318" observedRunningTime="2026-03-01 10:01:06.954230473 +0000 UTC m=+3196.196109670" watchObservedRunningTime="2026-03-01 10:01:07.985557338 +0000 UTC m=+3197.227436535" Mar 01 10:01:07 crc kubenswrapper[4792]: I0301 10:01:07.986768 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29539321-sclgm" podStartSLOduration=7.986764488 podStartE2EDuration="7.986764488s" podCreationTimestamp="2026-03-01 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:01:07.982891522 +0000 UTC m=+3197.224770719" watchObservedRunningTime="2026-03-01 10:01:07.986764488 +0000 UTC m=+3197.228643675" Mar 01 10:01:08 crc 
kubenswrapper[4792]: I0301 10:01:08.004887 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-79f8cb6d9d-xg7h5" podStartSLOduration=4.301035419 podStartE2EDuration="15.00487385s" podCreationTimestamp="2026-03-01 10:00:53 +0000 UTC" firstStartedPulling="2026-03-01 10:00:55.259448145 +0000 UTC m=+3184.501327342" lastFinishedPulling="2026-03-01 10:01:05.963286586 +0000 UTC m=+3195.205165773" observedRunningTime="2026-03-01 10:01:08.002954642 +0000 UTC m=+3197.244833849" watchObservedRunningTime="2026-03-01 10:01:08.00487385 +0000 UTC m=+3197.246753047" Mar 01 10:01:08 crc kubenswrapper[4792]: I0301 10:01:08.006775 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52b189da-3327-40c1-bf22-a842b0980593","Type":"ContainerStarted","Data":"ff29ffdd6a80c6f635ebf305dac713755a6834f40a80f0b4fc3312233561d5b1"} Mar 01 10:01:08 crc kubenswrapper[4792]: I0301 10:01:08.006827 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52b189da-3327-40c1-bf22-a842b0980593","Type":"ContainerStarted","Data":"bcb072fe43d6152d4fc47f391c97a3f0de0392a05202abc73c45febc53acaf52"} Mar 01 10:01:08 crc kubenswrapper[4792]: I0301 10:01:08.021744 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5887d74897-rnlz9" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon-log" containerID="cri-o://b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373" gracePeriod=30 Mar 01 10:01:08 crc kubenswrapper[4792]: I0301 10:01:08.021860 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5887d74897-rnlz9" event={"ID":"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7","Type":"ContainerStarted","Data":"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095"} Mar 01 10:01:08 crc kubenswrapper[4792]: I0301 10:01:08.021932 4792 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/horizon-5887d74897-rnlz9" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon" containerID="cri-o://4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095" gracePeriod=30 Mar 01 10:01:08 crc kubenswrapper[4792]: I0301 10:01:08.057184 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.057160293 podStartE2EDuration="9.057160293s" podCreationTimestamp="2026-03-01 10:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:01:08.043714238 +0000 UTC m=+3197.285593445" watchObservedRunningTime="2026-03-01 10:01:08.057160293 +0000 UTC m=+3197.299039490" Mar 01 10:01:08 crc kubenswrapper[4792]: I0301 10:01:08.085939 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5887d74897-rnlz9" podStartSLOduration=3.596435919 podStartE2EDuration="17.08592169s" podCreationTimestamp="2026-03-01 10:00:51 +0000 UTC" firstStartedPulling="2026-03-01 10:00:52.375235238 +0000 UTC m=+3181.617114435" lastFinishedPulling="2026-03-01 10:01:05.864721009 +0000 UTC m=+3195.106600206" observedRunningTime="2026-03-01 10:01:08.072568807 +0000 UTC m=+3197.314448004" watchObservedRunningTime="2026-03-01 10:01:08.08592169 +0000 UTC m=+3197.327800887" Mar 01 10:01:09 crc kubenswrapper[4792]: I0301 10:01:09.035311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d055103-6c35-481f-820a-7aa363543404","Type":"ContainerStarted","Data":"95765fba596a2f1562b23cd8ba4b618f6eae542fa1c12361f390021023e3a630"} Mar 01 10:01:09 crc kubenswrapper[4792]: I0301 10:01:09.065447 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.065424023 
podStartE2EDuration="10.065424023s" podCreationTimestamp="2026-03-01 10:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:01:09.054152682 +0000 UTC m=+3198.296031879" watchObservedRunningTime="2026-03-01 10:01:09.065424023 +0000 UTC m=+3198.307303220" Mar 01 10:01:09 crc kubenswrapper[4792]: I0301 10:01:09.491355 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 01 10:01:09 crc kubenswrapper[4792]: I0301 10:01:09.491406 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 01 10:01:09 crc kubenswrapper[4792]: I0301 10:01:09.537441 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 01 10:01:09 crc kubenswrapper[4792]: I0301 10:01:09.559736 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 01 10:01:10 crc kubenswrapper[4792]: I0301 10:01:10.041332 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 01 10:01:10 crc kubenswrapper[4792]: I0301 10:01:10.042739 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 01 10:01:10 crc kubenswrapper[4792]: I0301 10:01:10.203896 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:10 crc kubenswrapper[4792]: I0301 10:01:10.203966 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:10 crc kubenswrapper[4792]: I0301 10:01:10.243246 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Mar 01 10:01:10 crc kubenswrapper[4792]: I0301 10:01:10.285068 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:11 crc kubenswrapper[4792]: I0301 10:01:11.049232 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:11 crc kubenswrapper[4792]: I0301 10:01:11.049287 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:11 crc kubenswrapper[4792]: I0301 10:01:11.458460 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:01:11 crc kubenswrapper[4792]: I0301 10:01:11.725278 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:01:12 crc kubenswrapper[4792]: I0301 10:01:12.059270 4792 generic.go:334] "Generic (PLEG): container finished" podID="7ec04609-b280-4df0-a0c5-2e4c7208c1c6" containerID="45c567ce10495b75abefb1478976cbc609499a5f59a4a0f633c7587b44678000" exitCode=0 Mar 01 10:01:12 crc kubenswrapper[4792]: I0301 10:01:12.060151 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29539321-sclgm" event={"ID":"7ec04609-b280-4df0-a0c5-2e4c7208c1c6","Type":"ContainerDied","Data":"45c567ce10495b75abefb1478976cbc609499a5f59a4a0f633c7587b44678000"} Mar 01 10:01:14 crc kubenswrapper[4792]: I0301 10:01:14.321877 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:01:14 crc kubenswrapper[4792]: I0301 10:01:14.322441 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:01:14 crc kubenswrapper[4792]: I0301 10:01:14.462082 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:01:14 crc kubenswrapper[4792]: I0301 10:01:14.462119 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.132580 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29539321-sclgm" event={"ID":"7ec04609-b280-4df0-a0c5-2e4c7208c1c6","Type":"ContainerDied","Data":"9eb42b80e67cd6ee111c52690140db4093810e4ed24e856419a0421b142b5cae"} Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.132831 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eb42b80e67cd6ee111c52690140db4093810e4ed24e856419a0421b142b5cae" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.217728 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.411164 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-fernet-keys\") pod \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.411274 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-config-data\") pod \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.411303 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72wdc\" (UniqueName: \"kubernetes.io/projected/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-kube-api-access-72wdc\") pod \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " 
Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.411452 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-combined-ca-bundle\") pod \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.434627 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7ec04609-b280-4df0-a0c5-2e4c7208c1c6" (UID: "7ec04609-b280-4df0-a0c5-2e4c7208c1c6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.457106 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-kube-api-access-72wdc" (OuterVolumeSpecName: "kube-api-access-72wdc") pod "7ec04609-b280-4df0-a0c5-2e4c7208c1c6" (UID: "7ec04609-b280-4df0-a0c5-2e4c7208c1c6"). InnerVolumeSpecName "kube-api-access-72wdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.503065 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ec04609-b280-4df0-a0c5-2e4c7208c1c6" (UID: "7ec04609-b280-4df0-a0c5-2e4c7208c1c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.523367 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72wdc\" (UniqueName: \"kubernetes.io/projected/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-kube-api-access-72wdc\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.523400 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.523411 4792 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.535277 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-config-data" (OuterVolumeSpecName: "config-data") pod "7ec04609-b280-4df0-a0c5-2e4c7208c1c6" (UID: "7ec04609-b280-4df0-a0c5-2e4c7208c1c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.626326 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:17 crc kubenswrapper[4792]: I0301 10:01:17.085351 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 01 10:01:17 crc kubenswrapper[4792]: I0301 10:01:17.115449 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:17 crc kubenswrapper[4792]: I0301 10:01:17.143899 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:17 crc kubenswrapper[4792]: I0301 10:01:17.145349 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-p2dtn" event={"ID":"01bf5dae-6217-4644-9c9b-65d3886a4dc1","Type":"ContainerStarted","Data":"a0b92e5e12c9f6c28a62c7a978997e469e5d99be007aba61824b2b6c8d62ffa5"} Mar 01 10:01:17 crc kubenswrapper[4792]: I0301 10:01:17.163925 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-p2dtn" podStartSLOduration=6.72130499 podStartE2EDuration="16.163892306s" podCreationTimestamp="2026-03-01 10:01:01 +0000 UTC" firstStartedPulling="2026-03-01 10:01:06.589448941 +0000 UTC m=+3195.831328138" lastFinishedPulling="2026-03-01 10:01:16.032036257 +0000 UTC m=+3205.273915454" observedRunningTime="2026-03-01 10:01:17.161304382 +0000 UTC m=+3206.403183579" watchObservedRunningTime="2026-03-01 10:01:17.163892306 +0000 UTC m=+3206.405771503" Mar 01 10:01:17 crc kubenswrapper[4792]: I0301 10:01:17.184454 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 01 10:01:19 crc 
kubenswrapper[4792]: I0301 10:01:19.289787 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:24 crc kubenswrapper[4792]: I0301 10:01:24.323258 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-689c76c966-7mbkl" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.12:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.12:8443: connect: connection refused" Mar 01 10:01:24 crc kubenswrapper[4792]: I0301 10:01:24.464049 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-79f8cb6d9d-xg7h5" podUID="d7f79f77-ac1b-445e-8e28-85c8964f5461" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.13:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.13:8443: connect: connection refused" Mar 01 10:01:29 crc kubenswrapper[4792]: I0301 10:01:29.248980 4792 generic.go:334] "Generic (PLEG): container finished" podID="01bf5dae-6217-4644-9c9b-65d3886a4dc1" containerID="a0b92e5e12c9f6c28a62c7a978997e469e5d99be007aba61824b2b6c8d62ffa5" exitCode=0 Mar 01 10:01:29 crc kubenswrapper[4792]: I0301 10:01:29.249547 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-p2dtn" event={"ID":"01bf5dae-6217-4644-9c9b-65d3886a4dc1","Type":"ContainerDied","Data":"a0b92e5e12c9f6c28a62c7a978997e469e5d99be007aba61824b2b6c8d62ffa5"} Mar 01 10:01:30 crc kubenswrapper[4792]: I0301 10:01:30.986524 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.107783 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-job-config-data\") pod \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.107879 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8pgg\" (UniqueName: \"kubernetes.io/projected/01bf5dae-6217-4644-9c9b-65d3886a4dc1-kube-api-access-n8pgg\") pod \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.107907 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-combined-ca-bundle\") pod \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.107939 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-config-data\") pod \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.134376 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01bf5dae-6217-4644-9c9b-65d3886a4dc1-kube-api-access-n8pgg" (OuterVolumeSpecName: "kube-api-access-n8pgg") pod "01bf5dae-6217-4644-9c9b-65d3886a4dc1" (UID: "01bf5dae-6217-4644-9c9b-65d3886a4dc1"). InnerVolumeSpecName "kube-api-access-n8pgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.140232 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "01bf5dae-6217-4644-9c9b-65d3886a4dc1" (UID: "01bf5dae-6217-4644-9c9b-65d3886a4dc1"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.147481 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-config-data" (OuterVolumeSpecName: "config-data") pod "01bf5dae-6217-4644-9c9b-65d3886a4dc1" (UID: "01bf5dae-6217-4644-9c9b-65d3886a4dc1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.151079 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01bf5dae-6217-4644-9c9b-65d3886a4dc1" (UID: "01bf5dae-6217-4644-9c9b-65d3886a4dc1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.210380 4792 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.210431 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8pgg\" (UniqueName: \"kubernetes.io/projected/01bf5dae-6217-4644-9c9b-65d3886a4dc1-kube-api-access-n8pgg\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.210447 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.210455 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.269168 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-p2dtn" event={"ID":"01bf5dae-6217-4644-9c9b-65d3886a4dc1","Type":"ContainerDied","Data":"602a6322b970b1abc9872680d2570a6e260b8dc7e4033e670c6368378000d632"} Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.269524 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="602a6322b970b1abc9872680d2570a6e260b8dc7e4033e670c6368378000d632" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.269593 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.770133 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:01:31 crc kubenswrapper[4792]: E0301 10:01:31.770491 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec04609-b280-4df0-a0c5-2e4c7208c1c6" containerName="keystone-cron" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.770507 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec04609-b280-4df0-a0c5-2e4c7208c1c6" containerName="keystone-cron" Mar 01 10:01:31 crc kubenswrapper[4792]: E0301 10:01:31.770534 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bf5dae-6217-4644-9c9b-65d3886a4dc1" containerName="manila-db-sync" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.770539 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bf5dae-6217-4644-9c9b-65d3886a4dc1" containerName="manila-db-sync" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.770722 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bf5dae-6217-4644-9c9b-65d3886a4dc1" containerName="manila-db-sync" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.770741 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ec04609-b280-4df0-a0c5-2e4c7208c1c6" containerName="keystone-cron" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.779689 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.790261 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.790489 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-x2pjp" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.790607 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.790715 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.823966 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.825572 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.832180 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.848566 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.848610 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-scripts\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 
01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.848654 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-ceph\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.848709 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.848754 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.848774 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.848818 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5f4d\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-kube-api-access-r5f4d\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc 
kubenswrapper[4792]: I0301 10:01:31.848863 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.857099 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.882195 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.941232 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86c6bdcc4c-fqgkv"] Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.942708 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.952880 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-scripts\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.952961 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-ceph\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953005 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-scripts\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953024 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953042 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953072 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953123 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953140 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-etc-machine-id\") pod 
\"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953157 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953196 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5f4d\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-kube-api-access-r5f4d\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953215 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953246 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbl2x\" (UniqueName: \"kubernetes.io/projected/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-kube-api-access-qbl2x\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953272 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data-custom\") pod \"manila-share-share1-0\" (UID: 
\"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953297 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953502 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.954331 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.966587 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-scripts\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.978514 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-ceph\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.979278 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.993451 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5f4d\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-kube-api-access-r5f4d\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.006986 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86c6bdcc4c-fqgkv"] Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.007604 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.009542 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.057171 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.058894 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.061972 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-openstack-edpm-ipam\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062016 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-scripts\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062040 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062058 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062098 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-ovsdbserver-nb\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc 
kubenswrapper[4792]: I0301 10:01:32.062140 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062167 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-dns-svc\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062189 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-config\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062211 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062241 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbl2x\" (UniqueName: \"kubernetes.io/projected/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-kube-api-access-qbl2x\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062290 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-ovsdbserver-sb\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062316 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxsmh\" (UniqueName: \"kubernetes.io/projected/49541358-1fd0-4d1d-8b61-0c618994dfc0-kube-api-access-rxsmh\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.067622 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.068286 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.069189 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.074206 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.090364 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-scripts\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.090747 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.091258 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.120001 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.153547 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbl2x\" (UniqueName: \"kubernetes.io/projected/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-kube-api-access-qbl2x\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.156255 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.165711 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xv8r\" (UniqueName: \"kubernetes.io/projected/992c179f-7feb-4441-b94a-81b52133f671-kube-api-access-6xv8r\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.165777 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-ovsdbserver-nb\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.165830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-dns-svc\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.165858 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-config\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.165881 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc 
kubenswrapper[4792]: I0301 10:01:32.165934 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.165949 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-scripts\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.165987 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-ovsdbserver-sb\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.166005 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data-custom\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.166029 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxsmh\" (UniqueName: \"kubernetes.io/projected/49541358-1fd0-4d1d-8b61-0c618994dfc0-kube-api-access-rxsmh\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.166048 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/992c179f-7feb-4441-b94a-81b52133f671-etc-machine-id\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.166069 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992c179f-7feb-4441-b94a-81b52133f671-logs\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.166096 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-openstack-edpm-ipam\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.167722 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-ovsdbserver-nb\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.168243 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-dns-svc\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.169070 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-config\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.185259 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-ovsdbserver-sb\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.206180 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-openstack-edpm-ipam\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.222679 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxsmh\" (UniqueName: \"kubernetes.io/projected/49541358-1fd0-4d1d-8b61-0c618994dfc0-kube-api-access-rxsmh\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.271363 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.271421 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-scripts\") pod \"manila-api-0\" (UID: 
\"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.271498 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data-custom\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.271529 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/992c179f-7feb-4441-b94a-81b52133f671-etc-machine-id\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.271552 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992c179f-7feb-4441-b94a-81b52133f671-logs\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.271595 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xv8r\" (UniqueName: \"kubernetes.io/projected/992c179f-7feb-4441-b94a-81b52133f671-kube-api-access-6xv8r\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.274406 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.278003 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/992c179f-7feb-4441-b94a-81b52133f671-etc-machine-id\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.278410 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992c179f-7feb-4441-b94a-81b52133f671-logs\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.284513 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data-custom\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.291658 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.291880 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-scripts\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.292774 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.304688 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.321208 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xv8r\" (UniqueName: \"kubernetes.io/projected/992c179f-7feb-4441-b94a-81b52133f671-kube-api-access-6xv8r\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.609402 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 01 10:01:33 crc kubenswrapper[4792]: I0301 10:01:33.199165 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 01 10:01:33 crc kubenswrapper[4792]: I0301 10:01:33.292938 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86c6bdcc4c-fqgkv"] Mar 01 10:01:33 crc kubenswrapper[4792]: I0301 10:01:33.394150 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38","Type":"ContainerStarted","Data":"bd049fc1e63654f23ff767894cd0f3d2ee5d142592fec5855ea0f40d653d992d"} Mar 01 10:01:33 crc kubenswrapper[4792]: I0301 10:01:33.403710 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" event={"ID":"49541358-1fd0-4d1d-8b61-0c618994dfc0","Type":"ContainerStarted","Data":"908de245b9fc63d7b33b773f418fa3e1ffa0e2698ef86729aa65356a6eba8160"} Mar 01 10:01:33 crc kubenswrapper[4792]: I0301 10:01:33.495880 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/manila-share-share1-0"] Mar 01 10:01:33 crc kubenswrapper[4792]: I0301 10:01:33.608585 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.322009 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-689c76c966-7mbkl" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.12:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.12:8443: connect: connection refused" Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.421888 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"992c179f-7feb-4441-b94a-81b52133f671","Type":"ContainerStarted","Data":"b3828e251539f5a3929096bc64abd403423b14917056f3386127f41c90279e5b"} Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.429692 4792 generic.go:334] "Generic (PLEG): container finished" podID="49541358-1fd0-4d1d-8b61-0c618994dfc0" containerID="bab38211efbbbcda82215e208b0cd4286972a7d51938b81d8c458f5923927746" exitCode=0 Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.429767 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" event={"ID":"49541358-1fd0-4d1d-8b61-0c618994dfc0","Type":"ContainerDied","Data":"bab38211efbbbcda82215e208b0cd4286972a7d51938b81d8c458f5923927746"} Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.445087 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2e20671b-de40-40a8-8237-2bd9940b9af5","Type":"ContainerStarted","Data":"4f751b65cd13e8f9d7bbd49d2e059dd52782b4f71c9b59600e77ee78c852e20a"} Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.464098 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-79f8cb6d9d-xg7h5" podUID="d7f79f77-ac1b-445e-8e28-85c8964f5461" containerName="horizon" 
probeResult="failure" output="Get \"https://10.217.1.13:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.13:8443: connect: connection refused" Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.943675 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.944137 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.944179 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.945055 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.945119 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" gracePeriod=600 Mar 01 10:01:35 crc kubenswrapper[4792]: E0301 
10:01:35.169003 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.487654 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" exitCode=0 Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.488054 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"} Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.488089 4792 scope.go:117] "RemoveContainer" containerID="0c8bbdbe3553a3ed4617103bbd379245337bfad8f18870faba13af9b3c14caa1" Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.488728 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:01:35 crc kubenswrapper[4792]: E0301 10:01:35.489030 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.503284 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/manila-api-0" event={"ID":"992c179f-7feb-4441-b94a-81b52133f671","Type":"ContainerStarted","Data":"6771ca6e038c525ca49561d70b9c91f98294b59eb490890e83179c43e3ef6385"} Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.517838 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" event={"ID":"49541358-1fd0-4d1d-8b61-0c618994dfc0","Type":"ContainerStarted","Data":"260f9efb3b812a0d52fed5708cf6028ed15756fdebf443c912b3f22eea189dcb"} Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.518372 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.562714 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" podStartSLOduration=4.562696494 podStartE2EDuration="4.562696494s" podCreationTimestamp="2026-03-01 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:01:35.554807397 +0000 UTC m=+3224.796686594" watchObservedRunningTime="2026-03-01 10:01:35.562696494 +0000 UTC m=+3224.804575681" Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.815261 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 01 10:01:36 crc kubenswrapper[4792]: I0301 10:01:36.566402 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"992c179f-7feb-4441-b94a-81b52133f671","Type":"ContainerStarted","Data":"6344b5116272aa3291056d2fadaea6c73aaf3b2724251ae1e478364c6553097b"} Mar 01 10:01:36 crc kubenswrapper[4792]: I0301 10:01:36.566798 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api-log" 
containerID="cri-o://6771ca6e038c525ca49561d70b9c91f98294b59eb490890e83179c43e3ef6385" gracePeriod=30 Mar 01 10:01:36 crc kubenswrapper[4792]: I0301 10:01:36.567204 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 01 10:01:36 crc kubenswrapper[4792]: I0301 10:01:36.567511 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api" containerID="cri-o://6344b5116272aa3291056d2fadaea6c73aaf3b2724251ae1e478364c6553097b" gracePeriod=30 Mar 01 10:01:36 crc kubenswrapper[4792]: I0301 10:01:36.628053 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38","Type":"ContainerStarted","Data":"dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a"} Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.584599 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.624466 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=6.624447081 podStartE2EDuration="6.624447081s" podCreationTimestamp="2026-03-01 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:01:36.597542457 +0000 UTC m=+3225.839421654" watchObservedRunningTime="2026-03-01 10:01:37.624447081 +0000 UTC m=+3226.866326278" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.679551 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcmd2\" (UniqueName: \"kubernetes.io/projected/7026175e-efaf-497a-aaf1-079f2811ad08-kube-api-access-zcmd2\") pod \"7026175e-efaf-497a-aaf1-079f2811ad08\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.679603 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-config-data\") pod \"7026175e-efaf-497a-aaf1-079f2811ad08\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.679637 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7026175e-efaf-497a-aaf1-079f2811ad08-logs\") pod \"7026175e-efaf-497a-aaf1-079f2811ad08\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.680355 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7026175e-efaf-497a-aaf1-079f2811ad08-horizon-secret-key\") pod \"7026175e-efaf-497a-aaf1-079f2811ad08\" (UID: 
\"7026175e-efaf-497a-aaf1-079f2811ad08\") " Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.680449 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-scripts\") pod \"7026175e-efaf-497a-aaf1-079f2811ad08\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.680814 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38","Type":"ContainerStarted","Data":"06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98"} Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.688126 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7026175e-efaf-497a-aaf1-079f2811ad08-logs" (OuterVolumeSpecName: "logs") pod "7026175e-efaf-497a-aaf1-079f2811ad08" (UID: "7026175e-efaf-497a-aaf1-079f2811ad08"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.728859 4792 generic.go:334] "Generic (PLEG): container finished" podID="7026175e-efaf-497a-aaf1-079f2811ad08" containerID="54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac" exitCode=137 Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.728898 4792 generic.go:334] "Generic (PLEG): container finished" podID="7026175e-efaf-497a-aaf1-079f2811ad08" containerID="b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa" exitCode=137 Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.729019 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5996ddfbb9-drpwn" event={"ID":"7026175e-efaf-497a-aaf1-079f2811ad08","Type":"ContainerDied","Data":"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac"} Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.729048 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5996ddfbb9-drpwn" event={"ID":"7026175e-efaf-497a-aaf1-079f2811ad08","Type":"ContainerDied","Data":"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa"} Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.729058 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5996ddfbb9-drpwn" event={"ID":"7026175e-efaf-497a-aaf1-079f2811ad08","Type":"ContainerDied","Data":"cdd7f60094a6b70916ce6fb3a1317938714cd27149f604e9df29f382e04185b2"} Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.729104 4792 scope.go:117] "RemoveContainer" containerID="54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.729700 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=5.385704516 podStartE2EDuration="6.729660973s" podCreationTimestamp="2026-03-01 10:01:31 +0000 UTC" firstStartedPulling="2026-03-01 
10:01:33.217093603 +0000 UTC m=+3222.458972800" lastFinishedPulling="2026-03-01 10:01:34.56105006 +0000 UTC m=+3223.802929257" observedRunningTime="2026-03-01 10:01:37.716720371 +0000 UTC m=+3226.958599568" watchObservedRunningTime="2026-03-01 10:01:37.729660973 +0000 UTC m=+3226.971540170" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.729759 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.733781 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7026175e-efaf-497a-aaf1-079f2811ad08-kube-api-access-zcmd2" (OuterVolumeSpecName: "kube-api-access-zcmd2") pod "7026175e-efaf-497a-aaf1-079f2811ad08" (UID: "7026175e-efaf-497a-aaf1-079f2811ad08"). InnerVolumeSpecName "kube-api-access-zcmd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.737256 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7026175e-efaf-497a-aaf1-079f2811ad08-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7026175e-efaf-497a-aaf1-079f2811ad08" (UID: "7026175e-efaf-497a-aaf1-079f2811ad08"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.765084 4792 generic.go:334] "Generic (PLEG): container finished" podID="992c179f-7feb-4441-b94a-81b52133f671" containerID="6344b5116272aa3291056d2fadaea6c73aaf3b2724251ae1e478364c6553097b" exitCode=0 Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.765389 4792 generic.go:334] "Generic (PLEG): container finished" podID="992c179f-7feb-4441-b94a-81b52133f671" containerID="6771ca6e038c525ca49561d70b9c91f98294b59eb490890e83179c43e3ef6385" exitCode=143 Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.765414 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"992c179f-7feb-4441-b94a-81b52133f671","Type":"ContainerDied","Data":"6344b5116272aa3291056d2fadaea6c73aaf3b2724251ae1e478364c6553097b"} Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.765439 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"992c179f-7feb-4441-b94a-81b52133f671","Type":"ContainerDied","Data":"6771ca6e038c525ca49561d70b9c91f98294b59eb490890e83179c43e3ef6385"} Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.780691 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-config-data" (OuterVolumeSpecName: "config-data") pod "7026175e-efaf-497a-aaf1-079f2811ad08" (UID: "7026175e-efaf-497a-aaf1-079f2811ad08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.783702 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-scripts" (OuterVolumeSpecName: "scripts") pod "7026175e-efaf-497a-aaf1-079f2811ad08" (UID: "7026175e-efaf-497a-aaf1-079f2811ad08"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.786393 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcmd2\" (UniqueName: \"kubernetes.io/projected/7026175e-efaf-497a-aaf1-079f2811ad08-kube-api-access-zcmd2\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.786414 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.786424 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7026175e-efaf-497a-aaf1-079f2811ad08-logs\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.786435 4792 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7026175e-efaf-497a-aaf1-079f2811ad08-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.786468 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.069513 4792 scope.go:117] "RemoveContainer" containerID="b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.308747 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.341972 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5996ddfbb9-drpwn"]
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.369403 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5996ddfbb9-drpwn"]
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.373529 4792 scope.go:117] "RemoveContainer" containerID="54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac"
Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.391060 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac\": container with ID starting with 54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac not found: ID does not exist" containerID="54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.391108 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac"} err="failed to get container status \"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac\": rpc error: code = NotFound desc = could not find container \"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac\": container with ID starting with 54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac not found: ID does not exist"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.391135 4792 scope.go:117] "RemoveContainer" containerID="b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa"
Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.398021 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa\": container with ID starting with b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa not found: ID does not exist" containerID="b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.398054 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa"} err="failed to get container status \"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa\": rpc error: code = NotFound desc = could not find container \"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa\": container with ID starting with b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa not found: ID does not exist"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.398075 4792 scope.go:117] "RemoveContainer" containerID="54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.398597 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac"} err="failed to get container status \"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac\": rpc error: code = NotFound desc = could not find container \"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac\": container with ID starting with 54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac not found: ID does not exist"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.398630 4792 scope.go:117] "RemoveContainer" containerID="b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.398849 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa"} err="failed to get container status \"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa\": rpc error: code = NotFound desc = could not find container \"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa\": container with ID starting with b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa not found: ID does not exist"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.409498 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data-custom\") pod \"992c179f-7feb-4441-b94a-81b52133f671\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") "
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.409640 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xv8r\" (UniqueName: \"kubernetes.io/projected/992c179f-7feb-4441-b94a-81b52133f671-kube-api-access-6xv8r\") pod \"992c179f-7feb-4441-b94a-81b52133f671\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") "
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.409678 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/992c179f-7feb-4441-b94a-81b52133f671-etc-machine-id\") pod \"992c179f-7feb-4441-b94a-81b52133f671\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") "
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.409774 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992c179f-7feb-4441-b94a-81b52133f671-logs\") pod \"992c179f-7feb-4441-b94a-81b52133f671\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") "
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.409838 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data\") pod \"992c179f-7feb-4441-b94a-81b52133f671\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") "
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.409881 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-combined-ca-bundle\") pod \"992c179f-7feb-4441-b94a-81b52133f671\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") "
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.409896 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-scripts\") pod \"992c179f-7feb-4441-b94a-81b52133f671\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") "
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.413122 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/992c179f-7feb-4441-b94a-81b52133f671-logs" (OuterVolumeSpecName: "logs") pod "992c179f-7feb-4441-b94a-81b52133f671" (UID: "992c179f-7feb-4441-b94a-81b52133f671"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.414681 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992c179f-7feb-4441-b94a-81b52133f671-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "992c179f-7feb-4441-b94a-81b52133f671" (UID: "992c179f-7feb-4441-b94a-81b52133f671"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.437050 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "992c179f-7feb-4441-b94a-81b52133f671" (UID: "992c179f-7feb-4441-b94a-81b52133f671"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.451416 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/992c179f-7feb-4441-b94a-81b52133f671-kube-api-access-6xv8r" (OuterVolumeSpecName: "kube-api-access-6xv8r") pod "992c179f-7feb-4441-b94a-81b52133f671" (UID: "992c179f-7feb-4441-b94a-81b52133f671"). InnerVolumeSpecName "kube-api-access-6xv8r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.473303 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-scripts" (OuterVolumeSpecName: "scripts") pod "992c179f-7feb-4441-b94a-81b52133f671" (UID: "992c179f-7feb-4441-b94a-81b52133f671"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.512088 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/992c179f-7feb-4441-b94a-81b52133f671-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.512116 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992c179f-7feb-4441-b94a-81b52133f671-logs\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.512127 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.512135 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.512146 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xv8r\" (UniqueName: \"kubernetes.io/projected/992c179f-7feb-4441-b94a-81b52133f671-kube-api-access-6xv8r\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.568974 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data" (OuterVolumeSpecName: "config-data") pod "992c179f-7feb-4441-b94a-81b52133f671" (UID: "992c179f-7feb-4441-b94a-81b52133f671"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.584612 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "992c179f-7feb-4441-b94a-81b52133f671" (UID: "992c179f-7feb-4441-b94a-81b52133f671"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.614109 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.614135 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.758601 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5887d74897-rnlz9"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.793788 4792 generic.go:334] "Generic (PLEG): container finished" podID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerID="4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095" exitCode=137
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.793819 4792 generic.go:334] "Generic (PLEG): container finished" podID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerID="b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373" exitCode=137
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.793854 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5887d74897-rnlz9" event={"ID":"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7","Type":"ContainerDied","Data":"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095"}
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.793878 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5887d74897-rnlz9" event={"ID":"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7","Type":"ContainerDied","Data":"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373"}
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.793888 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5887d74897-rnlz9" event={"ID":"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7","Type":"ContainerDied","Data":"f6db483fc0d11feaf9f5fa34256dfeb85c12b4ea3626fb137a55f680d9cb8957"}
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.793902 4792 scope.go:117] "RemoveContainer" containerID="4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.794169 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5887d74897-rnlz9"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.800350 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.810625 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"992c179f-7feb-4441-b94a-81b52133f671","Type":"ContainerDied","Data":"b3828e251539f5a3929096bc64abd403423b14917056f3386127f41c90279e5b"}
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.820065 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-config-data\") pod \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") "
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.820112 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-logs\") pod \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") "
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.820152 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-scripts\") pod \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") "
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.820184 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-horizon-secret-key\") pod \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") "
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.820207 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pf4f\" (UniqueName: \"kubernetes.io/projected/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-kube-api-access-2pf4f\") pod \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") "
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.822534 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-logs" (OuterVolumeSpecName: "logs") pod "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" (UID: "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.839170 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-kube-api-access-2pf4f" (OuterVolumeSpecName: "kube-api-access-2pf4f") pod "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" (UID: "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7"). InnerVolumeSpecName "kube-api-access-2pf4f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.852894 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"]
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.858125 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" (UID: "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.865940 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"]
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.904604 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-scripts" (OuterVolumeSpecName: "scripts") pod "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" (UID: "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.910088 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"]
Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.910564 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.910585 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon"
Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.910599 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.910607 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api"
Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.910624 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.910631 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon"
Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.910644 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon-log"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.910651 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon-log"
Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.910673 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api-log"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.910681 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api-log"
Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.910691 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon-log"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.910699 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon-log"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.912024 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.912052 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.912064 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.912075 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api-log"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.912090 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon-log"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.912108 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon-log"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.913125 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.918270 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.918446 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.918549 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.924935 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-scripts\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.924977 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-logs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925009 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-etc-machine-id\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925031 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-internal-tls-certs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925051 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-config-data-custom\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925092 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttblj\" (UniqueName: \"kubernetes.io/projected/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-kube-api-access-ttblj\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925224 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-config-data\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925269 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925290 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-public-tls-certs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925354 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-logs\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925366 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925376 4792 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925387 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pf4f\" (UniqueName: \"kubernetes.io/projected/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-kube-api-access-2pf4f\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.949452 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.005633 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-config-data" (OuterVolumeSpecName: "config-data") pod "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" (UID: "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.036347 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-etc-machine-id\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038300 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-etc-machine-id\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038317 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-internal-tls-certs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038368 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-config-data-custom\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038427 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttblj\" (UniqueName: \"kubernetes.io/projected/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-kube-api-access-ttblj\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038587 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-config-data\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038676 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038707 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-public-tls-certs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038789 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-scripts\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038818 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-logs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.040782 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.041138 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-logs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.043002 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-config-data-custom\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.049730 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.051898 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-scripts\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.052353 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-public-tls-certs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.052735 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-internal-tls-certs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.058863 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-config-data\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.065371 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttblj\" (UniqueName: \"kubernetes.io/projected/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-kube-api-access-ttblj\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.087201 4792 scope.go:117] "RemoveContainer" containerID="b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.160564 4792 scope.go:117] "RemoveContainer" containerID="4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095"
Mar 01 10:01:39 crc kubenswrapper[4792]: E0301 10:01:39.162263 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095\": container with ID starting with 4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095 not found: ID does not exist" containerID="4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.162306 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095"} err="failed to get container status \"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095\": rpc error: code = NotFound desc = could not find container \"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095\": container with ID starting with 4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095 not found: ID does not exist"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.162335 4792 scope.go:117] "RemoveContainer" containerID="b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373"
Mar 01 10:01:39 crc kubenswrapper[4792]: E0301 10:01:39.164347 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373\": container with ID starting with b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373 not found: ID does not exist" containerID="b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.164374 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373"} err="failed to get container status \"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373\": rpc error: code = NotFound desc = could not find container \"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373\": container with ID starting with b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373 not found: ID does not exist"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.164387 4792 scope.go:117] "RemoveContainer" containerID="4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.164560 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095"} err="failed to get container status \"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095\": rpc error: code = NotFound desc = could not find container \"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095\": container with ID starting with 4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095 not found: ID does not exist"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.164578 4792 scope.go:117] "RemoveContainer" containerID="b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.164749 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373"} err="failed to get container status \"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373\": rpc error: code = NotFound desc = could not find container \"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373\": container with ID starting with b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373 not found: ID does not exist"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.164769 4792 scope.go:117] "RemoveContainer" containerID="6344b5116272aa3291056d2fadaea6c73aaf3b2724251ae1e478364c6553097b"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.176396 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5887d74897-rnlz9"]
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.186051 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5887d74897-rnlz9"]
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.207052 4792 scope.go:117] "RemoveContainer" containerID="6771ca6e038c525ca49561d70b9c91f98294b59eb490890e83179c43e3ef6385"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.249669 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.438675 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" path="/var/lib/kubelet/pods/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7/volumes"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.450165 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" path="/var/lib/kubelet/pods/7026175e-efaf-497a-aaf1-079f2811ad08/volumes"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.450780 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="992c179f-7feb-4441-b94a-81b52133f671" path="/var/lib/kubelet/pods/992c179f-7feb-4441-b94a-81b52133f671/volumes"
Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.906264 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Mar 01 10:01:40 crc kubenswrapper[4792]: I0301 10:01:40.837052 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e6660fdc-5636-44ec-b6c0-e0e417d72e8a","Type":"ContainerStarted","Data":"2391d1418b109063f4ee67e28ee9b1de55f0fc32895676092aed9ef9a77d1746"}
Mar 01 10:01:40 crc kubenswrapper[4792]: I0301 10:01:40.837565 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e6660fdc-5636-44ec-b6c0-e0e417d72e8a","Type":"ContainerStarted","Data":"a8f7036e4a049e5741f848ea5e0b58d2f8611e1e0df796b9d649218a1f4571c5"}
Mar 01 10:01:41 crc kubenswrapper[4792]: I0301 10:01:41.851679 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e6660fdc-5636-44ec-b6c0-e0e417d72e8a","Type":"ContainerStarted","Data":"d2d356db92cc8311cc9321284f1aa2bc6fa39f9e2291eb803eace6b30a6f0334"}
Mar 01 10:01:41 crc kubenswrapper[4792]: I0301 10:01:41.852102 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openstack/manila-api-0" Mar 01 10:01:41 crc kubenswrapper[4792]: I0301 10:01:41.903324 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.9033084369999997 podStartE2EDuration="3.903308437s" podCreationTimestamp="2026-03-01 10:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:01:41.897559473 +0000 UTC m=+3231.139438670" watchObservedRunningTime="2026-03-01 10:01:41.903308437 +0000 UTC m=+3231.145187644" Mar 01 10:01:42 crc kubenswrapper[4792]: I0301 10:01:42.159060 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 01 10:01:42 crc kubenswrapper[4792]: I0301 10:01:42.295079 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:42 crc kubenswrapper[4792]: I0301 10:01:42.405779 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb7494899-9x44w"] Mar 01 10:01:42 crc kubenswrapper[4792]: I0301 10:01:42.406064 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cb7494899-9x44w" podUID="a3a408d8-0510-4867-8517-e609d614a5d2" containerName="dnsmasq-dns" containerID="cri-o://8b04ed1bd42aa863eef16fe08f4819a294c0cd44d80b6329225717f3d7d610c0" gracePeriod=10 Mar 01 10:01:42 crc kubenswrapper[4792]: I0301 10:01:42.863304 4792 generic.go:334] "Generic (PLEG): container finished" podID="a3a408d8-0510-4867-8517-e609d614a5d2" containerID="8b04ed1bd42aa863eef16fe08f4819a294c0cd44d80b6329225717f3d7d610c0" exitCode=0 Mar 01 10:01:42 crc kubenswrapper[4792]: I0301 10:01:42.864547 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7494899-9x44w" 
event={"ID":"a3a408d8-0510-4867-8517-e609d614a5d2","Type":"ContainerDied","Data":"8b04ed1bd42aa863eef16fe08f4819a294c0cd44d80b6329225717f3d7d610c0"} Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.078108 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.146332 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-sb\") pod \"a3a408d8-0510-4867-8517-e609d614a5d2\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.146640 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-dns-svc\") pod \"a3a408d8-0510-4867-8517-e609d614a5d2\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.146674 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-nb\") pod \"a3a408d8-0510-4867-8517-e609d614a5d2\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.146737 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjnt9\" (UniqueName: \"kubernetes.io/projected/a3a408d8-0510-4867-8517-e609d614a5d2-kube-api-access-cjnt9\") pod \"a3a408d8-0510-4867-8517-e609d614a5d2\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.146774 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-openstack-edpm-ipam\") pod \"a3a408d8-0510-4867-8517-e609d614a5d2\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.146790 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-config\") pod \"a3a408d8-0510-4867-8517-e609d614a5d2\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.175219 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3a408d8-0510-4867-8517-e609d614a5d2-kube-api-access-cjnt9" (OuterVolumeSpecName: "kube-api-access-cjnt9") pod "a3a408d8-0510-4867-8517-e609d614a5d2" (UID: "a3a408d8-0510-4867-8517-e609d614a5d2"). InnerVolumeSpecName "kube-api-access-cjnt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.230658 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a3a408d8-0510-4867-8517-e609d614a5d2" (UID: "a3a408d8-0510-4867-8517-e609d614a5d2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.260384 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjnt9\" (UniqueName: \"kubernetes.io/projected/a3a408d8-0510-4867-8517-e609d614a5d2-kube-api-access-cjnt9\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.260422 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.263772 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a3a408d8-0510-4867-8517-e609d614a5d2" (UID: "a3a408d8-0510-4867-8517-e609d614a5d2"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.265682 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-config" (OuterVolumeSpecName: "config") pod "a3a408d8-0510-4867-8517-e609d614a5d2" (UID: "a3a408d8-0510-4867-8517-e609d614a5d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.265887 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3a408d8-0510-4867-8517-e609d614a5d2" (UID: "a3a408d8-0510-4867-8517-e609d614a5d2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.273574 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a3a408d8-0510-4867-8517-e609d614a5d2" (UID: "a3a408d8-0510-4867-8517-e609d614a5d2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.363758 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.363787 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.363798 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.363807 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-config\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.877736 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7494899-9x44w" event={"ID":"a3a408d8-0510-4867-8517-e609d614a5d2","Type":"ContainerDied","Data":"3830dbf250a4bedcc384f157cb50c2184899780d10921fec2974544052904874"} Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.877784 4792 scope.go:117] "RemoveContainer" 
containerID="8b04ed1bd42aa863eef16fe08f4819a294c0cd44d80b6329225717f3d7d610c0" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.877799 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.916377 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb7494899-9x44w"] Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.935001 4792 scope.go:117] "RemoveContainer" containerID="9c10172b37a7ca756da9dc968292ad3961c9a1084ca116b725c3c50da7e6ecd8" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.935173 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb7494899-9x44w"] Mar 01 10:01:45 crc kubenswrapper[4792]: I0301 10:01:45.427873 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3a408d8-0510-4867-8517-e609d614a5d2" path="/var/lib/kubelet/pods/a3a408d8-0510-4867-8517-e609d614a5d2/volumes" Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.334938 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.335481 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-central-agent" containerID="cri-o://3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14" gracePeriod=30 Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.335783 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="proxy-httpd" containerID="cri-o://6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d" gracePeriod=30 Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.335861 4792 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-notification-agent" containerID="cri-o://61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57" gracePeriod=30 Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.335879 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="sg-core" containerID="cri-o://9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20" gracePeriod=30 Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.920138 4792 generic.go:334] "Generic (PLEG): container finished" podID="a4871267-e63c-4804-a404-869a0fdbd171" containerID="6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d" exitCode=0 Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.920463 4792 generic.go:334] "Generic (PLEG): container finished" podID="a4871267-e63c-4804-a404-869a0fdbd171" containerID="9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20" exitCode=2 Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.920474 4792 generic.go:334] "Generic (PLEG): container finished" podID="a4871267-e63c-4804-a404-869a0fdbd171" containerID="3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14" exitCode=0 Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.920172 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerDied","Data":"6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d"} Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.920506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerDied","Data":"9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20"} Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.920517 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerDied","Data":"3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14"} Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.960767 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.969344 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.412325 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:01:48 crc kubenswrapper[4792]: E0301 10:01:48.413353 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.832387 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.889772 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-log-httpd\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.889831 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-run-httpd\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.889967 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-sg-core-conf-yaml\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.889985 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl6tg\" (UniqueName: \"kubernetes.io/projected/a4871267-e63c-4804-a404-869a0fdbd171-kube-api-access-cl6tg\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.890031 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-combined-ca-bundle\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.890056 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-config-data\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.890106 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-ceilometer-tls-certs\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.890126 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-scripts\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.890501 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.893200 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.895943 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4871267-e63c-4804-a404-869a0fdbd171-kube-api-access-cl6tg" (OuterVolumeSpecName: "kube-api-access-cl6tg") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "kube-api-access-cl6tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.911525 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-scripts" (OuterVolumeSpecName: "scripts") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.966128 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.998146 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.998169 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.998180 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.998188 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.998198 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl6tg\" (UniqueName: \"kubernetes.io/projected/a4871267-e63c-4804-a404-869a0fdbd171-kube-api-access-cl6tg\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.002640 4792 generic.go:334] "Generic (PLEG): container finished" podID="a4871267-e63c-4804-a404-869a0fdbd171" containerID="61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57" exitCode=0 Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.002681 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerDied","Data":"61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57"} Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.002708 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerDied","Data":"5daa2a8c28823d6b2f08c8830868b8a9480988afbc9d6f7999eee6cf3c7e7ff9"} Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.002724 4792 scope.go:117] "RemoveContainer" containerID="6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.002874 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.033024 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.089421 4792 scope.go:117] "RemoveContainer" containerID="9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.096098 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.100486 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.100518 4792 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.122994 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-config-data" (OuterVolumeSpecName: "config-data") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.147046 4792 scope.go:117] "RemoveContainer" containerID="61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.202179 4792 scope.go:117] "RemoveContainer" containerID="3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.202315 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.261071 4792 scope.go:117] "RemoveContainer" containerID="6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d" Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.268039 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d\": container with ID starting with 6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d not found: ID does not exist" containerID="6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.268079 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d"} err="failed to get container status \"6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d\": rpc error: code = NotFound desc = could not find container \"6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d\": container with ID starting with 6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d not found: ID does not exist"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.268105 4792 scope.go:117] "RemoveContainer" containerID="9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20"
Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.274017 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20\": container with ID starting with 9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20 not found: ID does not exist" containerID="9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.274052 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20"} err="failed to get container status \"9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20\": rpc error: code = NotFound desc = could not find container \"9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20\": container with ID starting with 9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20 not found: ID does not exist"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.274072 4792 scope.go:117] "RemoveContainer" containerID="61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57"
Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.276188 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57\": container with ID starting with 61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57 not found: ID does not exist" containerID="61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.276224 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57"} err="failed to get container status \"61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57\": rpc error: code = NotFound desc = could not find container \"61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57\": container with ID starting with 61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57 not found: ID does not exist"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.276244 4792 scope.go:117] "RemoveContainer" containerID="3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14"
Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.281000 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14\": container with ID starting with 3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14 not found: ID does not exist" containerID="3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.281027 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14"} err="failed to get container status \"3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14\": rpc error: code = NotFound desc = could not find container \"3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14\": container with ID starting with 3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14 not found: ID does not exist"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.362363 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.375691 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.439763 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4871267-e63c-4804-a404-869a0fdbd171" path="/var/lib/kubelet/pods/a4871267-e63c-4804-a404-869a0fdbd171/volumes"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.440561 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.441067 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-central-agent"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441079 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-central-agent"
Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.441093 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-notification-agent"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441099 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-notification-agent"
Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.441114 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="sg-core"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441119 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="sg-core"
Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.441130 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3a408d8-0510-4867-8517-e609d614a5d2" containerName="dnsmasq-dns"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441136 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a408d8-0510-4867-8517-e609d614a5d2" containerName="dnsmasq-dns"
Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.441146 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3a408d8-0510-4867-8517-e609d614a5d2" containerName="init"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441152 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a408d8-0510-4867-8517-e609d614a5d2" containerName="init"
Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.441167 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="proxy-httpd"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441172 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="proxy-httpd"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441348 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-central-agent"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441369 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3a408d8-0510-4867-8517-e609d614a5d2" containerName="dnsmasq-dns"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441377 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="sg-core"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441388 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-notification-agent"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441398 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="proxy-httpd"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.443191 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.443280 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.446973 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.447175 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.451258 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.521742 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.521888 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-log-httpd\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.521939 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-run-httpd\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.521965 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvckx\" (UniqueName: \"kubernetes.io/projected/d145fe82-e716-418e-990b-c139edc82fa5-kube-api-access-wvckx\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.521996 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.522018 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-config-data\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.522149 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-scripts\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.522173 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623489 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-scripts\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623530 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623570 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623642 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-log-httpd\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623668 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-run-httpd\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623688 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvckx\" (UniqueName: \"kubernetes.io/projected/d145fe82-e716-418e-990b-c139edc82fa5-kube-api-access-wvckx\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623709 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623726 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-config-data\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.624506 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-log-httpd\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.626433 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-run-httpd\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.628873 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-scripts\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.632855 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.636978 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-config-data\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.637631 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.638019 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.651370 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvckx\" (UniqueName: \"kubernetes.io/projected/d145fe82-e716-418e-990b-c139edc82fa5-kube-api-access-wvckx\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0"
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.770417 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 10:01:50 crc kubenswrapper[4792]: I0301 10:01:50.047825 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2e20671b-de40-40a8-8237-2bd9940b9af5","Type":"ContainerStarted","Data":"f8623324113a06ed0f840a7abbd929eaf57c541d1aae8f4b05f1bf6b431b24fe"}
Mar 01 10:01:50 crc kubenswrapper[4792]: I0301 10:01:50.414289 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 10:01:50 crc kubenswrapper[4792]: I0301 10:01:50.713744 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-689c76c966-7mbkl"
Mar 01 10:01:51 crc kubenswrapper[4792]: I0301 10:01:51.058965 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2e20671b-de40-40a8-8237-2bd9940b9af5","Type":"ContainerStarted","Data":"e9350b459945620c5d55700a03e5aa49740d1a4854b26be997bf075b28191bfe"}
Mar 01 10:01:51 crc kubenswrapper[4792]: I0301 10:01:51.060749 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerStarted","Data":"ce25965b24241e019e5342dcd3aa81c21c3982896b8fa1aed06c530d8002ff34"}
Mar 01 10:01:51 crc kubenswrapper[4792]: I0301 10:01:51.091246 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=5.19442946 podStartE2EDuration="20.091226474s" podCreationTimestamp="2026-03-01 10:01:31 +0000 UTC" firstStartedPulling="2026-03-01 10:01:33.546805901 +0000 UTC m=+3222.788685098" lastFinishedPulling="2026-03-01 10:01:48.443602915 +0000 UTC m=+3237.685482112" observedRunningTime="2026-03-01 10:01:51.088712211 +0000 UTC m=+3240.330591428" watchObservedRunningTime="2026-03-01 10:01:51.091226474 +0000 UTC m=+3240.333105671"
Mar 01 10:01:51 crc kubenswrapper[4792]: I0301 10:01:51.372610 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-79f8cb6d9d-xg7h5"
Mar 01 10:01:51 crc kubenswrapper[4792]: I0301 10:01:51.520405 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-689c76c966-7mbkl"]
Mar 01 10:01:51 crc kubenswrapper[4792]: I0301 10:01:51.520701 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-689c76c966-7mbkl" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon-log" containerID="cri-o://7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce" gracePeriod=30
Mar 01 10:01:51 crc kubenswrapper[4792]: I0301 10:01:51.520978 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-689c76c966-7mbkl" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon" containerID="cri-o://809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b" gracePeriod=30
Mar 01 10:01:52 crc kubenswrapper[4792]: I0301 10:01:52.070245 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerStarted","Data":"397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8"}
Mar 01 10:01:52 crc kubenswrapper[4792]: I0301 10:01:52.070586 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerStarted","Data":"60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189"}
Mar 01 10:01:52 crc kubenswrapper[4792]: I0301 10:01:52.120449 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Mar 01 10:01:53 crc kubenswrapper[4792]: I0301 10:01:53.081164 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerStarted","Data":"1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34"}
Mar 01 10:01:54 crc kubenswrapper[4792]: I0301 10:01:54.496274 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Mar 01 10:01:54 crc kubenswrapper[4792]: I0301 10:01:54.578993 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"]
Mar 01 10:01:54 crc kubenswrapper[4792]: I0301 10:01:54.709605 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-689c76c966-7mbkl" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.12:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:57444->10.217.1.12:8443: read: connection reset by peer"
Mar 01 10:01:55 crc kubenswrapper[4792]: I0301 10:01:55.098969 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerStarted","Data":"db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6"}
Mar 01 10:01:55 crc kubenswrapper[4792]: I0301 10:01:55.099570 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 01 10:01:55 crc kubenswrapper[4792]: I0301 10:01:55.101256 4792 generic.go:334] "Generic (PLEG): container finished" podID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerID="809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b" exitCode=0
Mar 01 10:01:55 crc kubenswrapper[4792]: I0301 10:01:55.101307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689c76c966-7mbkl" event={"ID":"67e060ef-1cc6-4b39-8622-bbcc183bdda0","Type":"ContainerDied","Data":"809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b"}
Mar 01 10:01:55 crc kubenswrapper[4792]: I0301 10:01:55.101453 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="manila-scheduler" containerID="cri-o://dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a" gracePeriod=30
Mar 01 10:01:55 crc kubenswrapper[4792]: I0301 10:01:55.101479 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="probe" containerID="cri-o://06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98" gracePeriod=30
Mar 01 10:01:55 crc kubenswrapper[4792]: I0301 10:01:55.126047 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.037003952 podStartE2EDuration="6.126031466s" podCreationTimestamp="2026-03-01 10:01:49 +0000 UTC" firstStartedPulling="2026-03-01 10:01:50.437161142 +0000 UTC m=+3239.679040339" lastFinishedPulling="2026-03-01 10:01:54.526188666 +0000 UTC m=+3243.768067853" observedRunningTime="2026-03-01 10:01:55.12337733 +0000 UTC m=+3244.365256527" watchObservedRunningTime="2026-03-01 10:01:55.126031466 +0000 UTC m=+3244.367910663"
Mar 01 10:01:56 crc kubenswrapper[4792]: I0301 10:01:56.112879 4792 generic.go:334] "Generic (PLEG): container finished" podID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerID="06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98" exitCode=0
Mar 01 10:01:56 crc kubenswrapper[4792]: I0301 10:01:56.112952 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38","Type":"ContainerDied","Data":"06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98"}
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.124622 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.134255 4792 generic.go:334] "Generic (PLEG): container finished" podID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerID="dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a" exitCode=0
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.134298 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.134299 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38","Type":"ContainerDied","Data":"dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a"}
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.134327 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38","Type":"ContainerDied","Data":"bd049fc1e63654f23ff767894cd0f3d2ee5d142592fec5855ea0f40d653d992d"}
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.134350 4792 scope.go:117] "RemoveContainer" containerID="06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.197120 4792 scope.go:117] "RemoveContainer" containerID="dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.227769 4792 scope.go:117] "RemoveContainer" containerID="06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98"
Mar 01 10:01:58 crc kubenswrapper[4792]: E0301 10:01:58.228660 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98\": container with ID starting with 06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98 not found: ID does not exist" containerID="06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.228736 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98"} err="failed to get container status \"06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98\": rpc error: code = NotFound desc = could not find container \"06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98\": container with ID starting with 06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98 not found: ID does not exist"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.228770 4792 scope.go:117] "RemoveContainer" containerID="dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a"
Mar 01 10:01:58 crc kubenswrapper[4792]: E0301 10:01:58.232213 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a\": container with ID starting with dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a not found: ID does not exist" containerID="dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.232260 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a"} err="failed to get container status \"dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a\": rpc error: code = NotFound desc = could not find container \"dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a\": container with ID starting with dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a not found: ID does not exist"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.242538 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-combined-ca-bundle\") pod \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") "
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.242781 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data\") pod \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") "
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.242870 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-scripts\") pod \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") "
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.243117 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbl2x\" (UniqueName: \"kubernetes.io/projected/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-kube-api-access-qbl2x\") pod \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") "
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.243261 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data-custom\") pod \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") "
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.243420 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-etc-machine-id\") pod \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") "
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.244419 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" (UID: "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.251859 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-scripts" (OuterVolumeSpecName: "scripts") pod "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" (UID: "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.253184 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" (UID: "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.268271 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-kube-api-access-qbl2x" (OuterVolumeSpecName: "kube-api-access-qbl2x") pod "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" (UID: "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38"). InnerVolumeSpecName "kube-api-access-qbl2x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.308259 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" (UID: "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.346829 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.346867 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.346881 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.346890 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbl2x\" (UniqueName: \"kubernetes.io/projected/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-kube-api-access-qbl2x\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.346901 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.365964 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data" (OuterVolumeSpecName: "config-data") pod "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" (UID: "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.449250 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.466299 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"]
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.474917 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"]
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.490367 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Mar 01 10:01:58 crc kubenswrapper[4792]: E0301 10:01:58.490715 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="manila-scheduler"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.490730 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="manila-scheduler"
Mar 01 10:01:58 crc kubenswrapper[4792]: E0301 10:01:58.490759 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="probe"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.490765 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="probe"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.490953 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="manila-scheduler"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.490974 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="probe"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.493299 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.505407 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.508052 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.552353 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-scripts\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.552397 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5813cf9a-1d9e-4a74-82e1-68e994c9175a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.552514 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.552569 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.552603 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgsgj\" (UniqueName: \"kubernetes.io/projected/5813cf9a-1d9e-4a74-82e1-68e994c9175a-kube-api-access-rgsgj\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.552736 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-config-data\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.655334 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-config-data\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.655398 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-scripts\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0"
Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.655417 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5813cf9a-1d9e-4a74-82e1-68e994c9175a-etc-machine-id\") pod
\"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.655477 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.655522 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.655547 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgsgj\" (UniqueName: \"kubernetes.io/projected/5813cf9a-1d9e-4a74-82e1-68e994c9175a-kube-api-access-rgsgj\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.656870 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5813cf9a-1d9e-4a74-82e1-68e994c9175a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.668778 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-config-data\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 
10:01:58.669202 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.670648 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-scripts\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.672458 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.684968 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgsgj\" (UniqueName: \"kubernetes.io/projected/5813cf9a-1d9e-4a74-82e1-68e994c9175a-kube-api-access-rgsgj\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.855460 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 01 10:01:59 crc kubenswrapper[4792]: I0301 10:01:59.355272 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 01 10:01:59 crc kubenswrapper[4792]: I0301 10:01:59.434171 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" path="/var/lib/kubelet/pods/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38/volumes" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.183689 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539322-cwkrx"] Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.185214 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.190300 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.190453 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.190566 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.221152 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"5813cf9a-1d9e-4a74-82e1-68e994c9175a","Type":"ContainerStarted","Data":"d9817d4083c6146200ac1a8ff5a62c565ddee025f2fa812a7e0633271e866126"} Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.221380 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"5813cf9a-1d9e-4a74-82e1-68e994c9175a","Type":"ContainerStarted","Data":"569ed6ced776f8e2e2dea6fb2be8cc5e458809032eb5e68566854ce47fc18ffe"} Mar 01 10:02:00 crc 
kubenswrapper[4792]: I0301 10:02:00.224964 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539322-cwkrx"] Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.244312 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xjg6\" (UniqueName: \"kubernetes.io/projected/d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b-kube-api-access-7xjg6\") pod \"auto-csr-approver-29539322-cwkrx\" (UID: \"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b\") " pod="openshift-infra/auto-csr-approver-29539322-cwkrx" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.349185 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjg6\" (UniqueName: \"kubernetes.io/projected/d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b-kube-api-access-7xjg6\") pod \"auto-csr-approver-29539322-cwkrx\" (UID: \"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b\") " pod="openshift-infra/auto-csr-approver-29539322-cwkrx" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.372759 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xjg6\" (UniqueName: \"kubernetes.io/projected/d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b-kube-api-access-7xjg6\") pod \"auto-csr-approver-29539322-cwkrx\" (UID: \"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b\") " pod="openshift-infra/auto-csr-approver-29539322-cwkrx" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.522876 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" Mar 01 10:02:01 crc kubenswrapper[4792]: I0301 10:02:01.123634 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539322-cwkrx"] Mar 01 10:02:01 crc kubenswrapper[4792]: I0301 10:02:01.230375 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" event={"ID":"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b","Type":"ContainerStarted","Data":"5a7d9c9a77b56bc28c264fe19e619434f70526367aaa6c6a9caab4e4dd2ae71c"} Mar 01 10:02:01 crc kubenswrapper[4792]: I0301 10:02:01.241374 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"5813cf9a-1d9e-4a74-82e1-68e994c9175a","Type":"ContainerStarted","Data":"d2093085d488f06da18044dc0dee3114661762c3b456b0a7fcc6c58eabb671c3"} Mar 01 10:02:01 crc kubenswrapper[4792]: I0301 10:02:01.262309 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.2622684140000002 podStartE2EDuration="3.262268414s" podCreationTimestamp="2026-03-01 10:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:02:01.259057014 +0000 UTC m=+3250.500936231" watchObservedRunningTime="2026-03-01 10:02:01.262268414 +0000 UTC m=+3250.504147621" Mar 01 10:02:01 crc kubenswrapper[4792]: I0301 10:02:01.418370 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:02:01 crc kubenswrapper[4792]: E0301 10:02:01.418970 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:02:01 crc kubenswrapper[4792]: I0301 10:02:01.604584 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Mar 01 10:02:02 crc kubenswrapper[4792]: I0301 10:02:02.250259 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" event={"ID":"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b","Type":"ContainerStarted","Data":"ec82ee72087ccca998ba33c4f0000c5036f1693b6f0bbb69a46ea60c4b7efd7e"} Mar 01 10:02:02 crc kubenswrapper[4792]: I0301 10:02:02.270263 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" podStartSLOduration=1.480602566 podStartE2EDuration="2.270241657s" podCreationTimestamp="2026-03-01 10:02:00 +0000 UTC" firstStartedPulling="2026-03-01 10:02:01.146617982 +0000 UTC m=+3250.388497179" lastFinishedPulling="2026-03-01 10:02:01.936257073 +0000 UTC m=+3251.178136270" observedRunningTime="2026-03-01 10:02:02.262970806 +0000 UTC m=+3251.504850013" watchObservedRunningTime="2026-03-01 10:02:02.270241657 +0000 UTC m=+3251.512120854" Mar 01 10:02:03 crc kubenswrapper[4792]: I0301 10:02:03.611420 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:02:03 crc kubenswrapper[4792]: I0301 10:02:03.612086 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-central-agent" containerID="cri-o://60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189" gracePeriod=30 Mar 01 10:02:03 crc kubenswrapper[4792]: I0301 10:02:03.612656 4792 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="proxy-httpd" containerID="cri-o://db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6" gracePeriod=30 Mar 01 10:02:03 crc kubenswrapper[4792]: I0301 10:02:03.612711 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="sg-core" containerID="cri-o://1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34" gracePeriod=30 Mar 01 10:02:03 crc kubenswrapper[4792]: I0301 10:02:03.612750 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-notification-agent" containerID="cri-o://397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8" gracePeriod=30 Mar 01 10:02:03 crc kubenswrapper[4792]: I0301 10:02:03.893355 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 01 10:02:03 crc kubenswrapper[4792]: I0301 10:02:03.948555 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.267680 4792 generic.go:334] "Generic (PLEG): container finished" podID="d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b" containerID="ec82ee72087ccca998ba33c4f0000c5036f1693b6f0bbb69a46ea60c4b7efd7e" exitCode=0 Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.267731 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" event={"ID":"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b","Type":"ContainerDied","Data":"ec82ee72087ccca998ba33c4f0000c5036f1693b6f0bbb69a46ea60c4b7efd7e"} Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.271370 4792 generic.go:334] "Generic (PLEG): container finished" podID="d145fe82-e716-418e-990b-c139edc82fa5" 
containerID="db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6" exitCode=0 Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.271397 4792 generic.go:334] "Generic (PLEG): container finished" podID="d145fe82-e716-418e-990b-c139edc82fa5" containerID="1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34" exitCode=2 Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.271410 4792 generic.go:334] "Generic (PLEG): container finished" podID="d145fe82-e716-418e-990b-c139edc82fa5" containerID="60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189" exitCode=0 Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.271858 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerDied","Data":"db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6"} Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.271935 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerDied","Data":"1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34"} Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.271948 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerDied","Data":"60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189"} Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.272225 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="manila-share" containerID="cri-o://f8623324113a06ed0f840a7abbd929eaf57c541d1aae8f4b05f1bf6b431b24fe" gracePeriod=30 Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.272394 4792 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/manila-share-share1-0" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="probe" containerID="cri-o://e9350b459945620c5d55700a03e5aa49740d1a4854b26be997bf075b28191bfe" gracePeriod=30 Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.321927 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-689c76c966-7mbkl" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.12:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.12:8443: connect: connection refused" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.221380 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256217 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-ceilometer-tls-certs\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256257 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-combined-ca-bundle\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256280 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-scripts\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256513 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-config-data\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256568 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-run-httpd\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256586 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-log-httpd\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256641 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvckx\" (UniqueName: \"kubernetes.io/projected/d145fe82-e716-418e-990b-c139edc82fa5-kube-api-access-wvckx\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256702 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-sg-core-conf-yaml\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.261697 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.262014 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.268930 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-scripts" (OuterVolumeSpecName: "scripts") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.278405 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d145fe82-e716-418e-990b-c139edc82fa5-kube-api-access-wvckx" (OuterVolumeSpecName: "kube-api-access-wvckx") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "kube-api-access-wvckx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.285064 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.285008 4792 generic.go:334] "Generic (PLEG): container finished" podID="d145fe82-e716-418e-990b-c139edc82fa5" containerID="397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8" exitCode=0 Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.285071 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerDied","Data":"397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8"} Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.285779 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerDied","Data":"ce25965b24241e019e5342dcd3aa81c21c3982896b8fa1aed06c530d8002ff34"} Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.285871 4792 scope.go:117] "RemoveContainer" containerID="db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.288937 4792 generic.go:334] "Generic (PLEG): container finished" podID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerID="e9350b459945620c5d55700a03e5aa49740d1a4854b26be997bf075b28191bfe" exitCode=0 Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.288968 4792 generic.go:334] "Generic (PLEG): container finished" podID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerID="f8623324113a06ed0f840a7abbd929eaf57c541d1aae8f4b05f1bf6b431b24fe" exitCode=1 Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.289170 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2e20671b-de40-40a8-8237-2bd9940b9af5","Type":"ContainerDied","Data":"e9350b459945620c5d55700a03e5aa49740d1a4854b26be997bf075b28191bfe"} Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.289229 4792 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2e20671b-de40-40a8-8237-2bd9940b9af5","Type":"ContainerDied","Data":"f8623324113a06ed0f840a7abbd929eaf57c541d1aae8f4b05f1bf6b431b24fe"} Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.289245 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2e20671b-de40-40a8-8237-2bd9940b9af5","Type":"ContainerDied","Data":"4f751b65cd13e8f9d7bbd49d2e059dd52782b4f71c9b59600e77ee78c852e20a"} Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.289259 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f751b65cd13e8f9d7bbd49d2e059dd52782b4f71c9b59600e77ee78c852e20a" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.339895 4792 scope.go:117] "RemoveContainer" containerID="1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.341798 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.349390 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358128 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5f4d\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-kube-api-access-r5f4d\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358205 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data-custom\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358270 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-ceph\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358325 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-scripts\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358366 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-combined-ca-bundle\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358403 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358471 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-etc-machine-id\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358533 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-var-lib-manila\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358956 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358979 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.359022 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvckx\" (UniqueName: \"kubernetes.io/projected/d145fe82-e716-418e-990b-c139edc82fa5-kube-api-access-wvckx\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.359037 4792 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 
10:02:05.359048 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.359125 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.362064 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.369866 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-scripts" (OuterVolumeSpecName: "scripts") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.372686 4792 scope.go:117] "RemoveContainer" containerID="397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.374878 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-ceph" (OuterVolumeSpecName: "ceph") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.379614 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-kube-api-access-r5f4d" (OuterVolumeSpecName: "kube-api-access-r5f4d") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "kube-api-access-r5f4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.384168 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.404861 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.404964 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.457145 4792 scope.go:117] "RemoveContainer" containerID="60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461268 4792 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-var-lib-manila\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461290 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461302 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5f4d\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-kube-api-access-r5f4d\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461312 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461320 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461329 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461338 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461346 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.492978 4792 scope.go:117] "RemoveContainer" containerID="db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.497465 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6\": container with ID starting with db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6 not found: ID does not exist" containerID="db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.497518 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6"} err="failed to get container status \"db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6\": rpc error: code = NotFound desc = could not find container \"db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6\": 
container with ID starting with db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6 not found: ID does not exist" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.497545 4792 scope.go:117] "RemoveContainer" containerID="1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.498044 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34\": container with ID starting with 1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34 not found: ID does not exist" containerID="1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.498081 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34"} err="failed to get container status \"1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34\": rpc error: code = NotFound desc = could not find container \"1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34\": container with ID starting with 1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34 not found: ID does not exist" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.498108 4792 scope.go:117] "RemoveContainer" containerID="397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.498881 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8\": container with ID starting with 397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8 not found: ID does not exist" 
containerID="397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.498935 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8"} err="failed to get container status \"397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8\": rpc error: code = NotFound desc = could not find container \"397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8\": container with ID starting with 397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8 not found: ID does not exist" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.498965 4792 scope.go:117] "RemoveContainer" containerID="60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.499229 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189\": container with ID starting with 60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189 not found: ID does not exist" containerID="60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.499257 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189"} err="failed to get container status \"60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189\": rpc error: code = NotFound desc = could not find container \"60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189\": container with ID starting with 60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189 not found: ID does not exist" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.501792 4792 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-config-data" (OuterVolumeSpecName: "config-data") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.503165 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.562919 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.562949 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.630042 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data" (OuterVolumeSpecName: "config-data") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.666614 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.708151 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.718439 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746079 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.746516 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="sg-core" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746530 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="sg-core" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.746545 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-notification-agent" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746551 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-notification-agent" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.746568 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="manila-share" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746574 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="manila-share" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 
10:02:05.746599 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="probe" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746604 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="probe" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.746612 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-central-agent" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746619 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-central-agent" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.746629 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="proxy-httpd" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746634 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="proxy-httpd" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746788 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-notification-agent" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746800 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="manila-share" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746815 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="proxy-httpd" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746823 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-central-agent" Mar 01 10:02:05 crc 
kubenswrapper[4792]: I0301 10:02:05.746832 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="sg-core" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746841 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="probe" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.748359 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.757424 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.757474 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.757432 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770549 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63238274-bc2e-4686-8371-e891944269f9-run-httpd\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770605 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770628 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-config-data\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770688 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4g9h\" (UniqueName: \"kubernetes.io/projected/63238274-bc2e-4686-8371-e891944269f9-kube-api-access-k4g9h\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770712 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-scripts\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770758 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63238274-bc2e-4686-8371-e891944269f9-log-httpd\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770776 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770803 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.794536 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872538 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4g9h\" (UniqueName: \"kubernetes.io/projected/63238274-bc2e-4686-8371-e891944269f9-kube-api-access-k4g9h\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872607 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-scripts\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872705 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63238274-bc2e-4686-8371-e891944269f9-log-httpd\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872736 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872793 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " 
pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872832 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63238274-bc2e-4686-8371-e891944269f9-run-httpd\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872886 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872929 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-config-data\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.873162 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63238274-bc2e-4686-8371-e891944269f9-log-httpd\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.873401 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63238274-bc2e-4686-8371-e891944269f9-run-httpd\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.880801 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.889514 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.890683 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.890892 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-config-data\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.893854 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-scripts\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.901399 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4g9h\" (UniqueName: \"kubernetes.io/projected/63238274-bc2e-4686-8371-e891944269f9-kube-api-access-k4g9h\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.993024 4792 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.085786 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xjg6\" (UniqueName: \"kubernetes.io/projected/d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b-kube-api-access-7xjg6\") pod \"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b\" (UID: \"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b\") " Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.089830 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b-kube-api-access-7xjg6" (OuterVolumeSpecName: "kube-api-access-7xjg6") pod "d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b" (UID: "d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b"). InnerVolumeSpecName "kube-api-access-7xjg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.115342 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.188275 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xjg6\" (UniqueName: \"kubernetes.io/projected/d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b-kube-api-access-7xjg6\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.324289 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.325857 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" event={"ID":"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b","Type":"ContainerDied","Data":"5a7d9c9a77b56bc28c264fe19e619434f70526367aaa6c6a9caab4e4dd2ae71c"} Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.325900 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a7d9c9a77b56bc28c264fe19e619434f70526367aaa6c6a9caab4e4dd2ae71c" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.350406 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.415424 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539316-8swf8"] Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.455508 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.480560 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539316-8swf8"] Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.514133 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.527873 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:02:06 crc kubenswrapper[4792]: E0301 10:02:06.528298 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b" containerName="oc" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.528310 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b" containerName="oc" Mar 01 
10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.528470 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b" containerName="oc" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.529387 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.531849 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.555762 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.601722 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-scripts\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.601769 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-config-data\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.601795 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03462f2f-874f-496a-934b-9fa6e2c55850-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.601858 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.601896 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03462f2f-874f-496a-934b-9fa6e2c55850-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.602086 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd7l6\" (UniqueName: \"kubernetes.io/projected/03462f2f-874f-496a-934b-9fa6e2c55850-kube-api-access-rd7l6\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.602119 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03462f2f-874f-496a-934b-9fa6e2c55850-ceph\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.602326 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.628414 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:02:06 crc 
kubenswrapper[4792]: I0301 10:02:06.704222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-scripts\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.704282 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-config-data\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.704323 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03462f2f-874f-496a-934b-9fa6e2c55850-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.704366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.704419 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03462f2f-874f-496a-934b-9fa6e2c55850-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.704492 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rd7l6\" (UniqueName: \"kubernetes.io/projected/03462f2f-874f-496a-934b-9fa6e2c55850-kube-api-access-rd7l6\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.704517 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03462f2f-874f-496a-934b-9fa6e2c55850-ceph\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.704660 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.706344 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03462f2f-874f-496a-934b-9fa6e2c55850-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.706403 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03462f2f-874f-496a-934b-9fa6e2c55850-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.711239 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-combined-ca-bundle\") pod \"manila-share-share1-0\" 
(UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.711285 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03462f2f-874f-496a-934b-9fa6e2c55850-ceph\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.715866 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-config-data\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.716413 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.717852 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-scripts\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.725512 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd7l6\" (UniqueName: \"kubernetes.io/projected/03462f2f-874f-496a-934b-9fa6e2c55850-kube-api-access-rd7l6\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.868608 4792 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 01 10:02:07 crc kubenswrapper[4792]: I0301 10:02:07.323611 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:02:07 crc kubenswrapper[4792]: I0301 10:02:07.362657 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63238274-bc2e-4686-8371-e891944269f9","Type":"ContainerStarted","Data":"cb5cb3199883bd66de434a4d6ee46e1ac35df0ce429d377c147e61cfcded4c58"} Mar 01 10:02:07 crc kubenswrapper[4792]: I0301 10:02:07.363447 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63238274-bc2e-4686-8371-e891944269f9","Type":"ContainerStarted","Data":"8e22199d500f0805c8cb6570bc524a00a0d8e763a509e63429d8035d5688ce87"} Mar 01 10:02:07 crc kubenswrapper[4792]: I0301 10:02:07.364670 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03462f2f-874f-496a-934b-9fa6e2c55850","Type":"ContainerStarted","Data":"d3e1d07f4f3d51eda7e71efc89eb8ff096e9663bfa6ae3162512081082bb1fdd"} Mar 01 10:02:07 crc kubenswrapper[4792]: I0301 10:02:07.419523 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" path="/var/lib/kubelet/pods/2e20671b-de40-40a8-8237-2bd9940b9af5/volumes" Mar 01 10:02:07 crc kubenswrapper[4792]: I0301 10:02:07.420397 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d145fe82-e716-418e-990b-c139edc82fa5" path="/var/lib/kubelet/pods/d145fe82-e716-418e-990b-c139edc82fa5/volumes" Mar 01 10:02:07 crc kubenswrapper[4792]: I0301 10:02:07.421065 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9273c82-13c3-43c5-b90e-16fdb09f082e" path="/var/lib/kubelet/pods/e9273c82-13c3-43c5-b90e-16fdb09f082e/volumes" Mar 01 10:02:08 crc kubenswrapper[4792]: I0301 10:02:08.375747 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"63238274-bc2e-4686-8371-e891944269f9","Type":"ContainerStarted","Data":"d6c30ffe6f1157993f4a62a4542ac5064a07854da7cd41adb3804ed8d39849e0"} Mar 01 10:02:08 crc kubenswrapper[4792]: I0301 10:02:08.378891 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03462f2f-874f-496a-934b-9fa6e2c55850","Type":"ContainerStarted","Data":"8795b79f2c94dd66df278ff8afdb3d312a999dfa7aaefa0d77fb6d40b12819e4"} Mar 01 10:02:08 crc kubenswrapper[4792]: I0301 10:02:08.378956 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03462f2f-874f-496a-934b-9fa6e2c55850","Type":"ContainerStarted","Data":"6a042cba6eb3d9aaa9069cad10940f945ae7ed290d26f8640b67514c0c058ab7"} Mar 01 10:02:08 crc kubenswrapper[4792]: I0301 10:02:08.411717 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.411699345 podStartE2EDuration="2.411699345s" podCreationTimestamp="2026-03-01 10:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:02:08.398854175 +0000 UTC m=+3257.640733372" watchObservedRunningTime="2026-03-01 10:02:08.411699345 +0000 UTC m=+3257.653578542" Mar 01 10:02:08 crc kubenswrapper[4792]: I0301 10:02:08.857088 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 01 10:02:09 crc kubenswrapper[4792]: I0301 10:02:09.391753 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63238274-bc2e-4686-8371-e891944269f9","Type":"ContainerStarted","Data":"8d216a9216980eb8491eb4fee4c11bec75c712da7d6e97f8a3cdee3fb22149e7"} Mar 01 10:02:10 crc kubenswrapper[4792]: I0301 10:02:10.406751 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"63238274-bc2e-4686-8371-e891944269f9","Type":"ContainerStarted","Data":"789df2a55bd98a76a43c542f546f7179c9a04fc9a4b5dec0e2e4ed322abb8b0f"} Mar 01 10:02:10 crc kubenswrapper[4792]: I0301 10:02:10.407952 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 01 10:02:10 crc kubenswrapper[4792]: I0301 10:02:10.437794 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.285034405 podStartE2EDuration="5.437777683s" podCreationTimestamp="2026-03-01 10:02:05 +0000 UTC" firstStartedPulling="2026-03-01 10:02:06.650140301 +0000 UTC m=+3255.892019498" lastFinishedPulling="2026-03-01 10:02:09.802883579 +0000 UTC m=+3259.044762776" observedRunningTime="2026-03-01 10:02:10.431716192 +0000 UTC m=+3259.673595399" watchObservedRunningTime="2026-03-01 10:02:10.437777683 +0000 UTC m=+3259.679656890" Mar 01 10:02:12 crc kubenswrapper[4792]: I0301 10:02:12.409412 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:02:12 crc kubenswrapper[4792]: E0301 10:02:12.410145 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:02:14 crc kubenswrapper[4792]: I0301 10:02:14.321800 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-689c76c966-7mbkl" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.12:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.12:8443: connect: connection refused" Mar 01 
10:02:14 crc kubenswrapper[4792]: I0301 10:02:14.321926 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:02:16 crc kubenswrapper[4792]: I0301 10:02:16.870274 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 01 10:02:20 crc kubenswrapper[4792]: I0301 10:02:20.552611 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 01 10:02:21 crc kubenswrapper[4792]: W0301 10:02:21.546237 4792 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1097b0e_1156_4f3c_b1e9_6f7b83d0e07b.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1097b0e_1156_4f3c_b1e9_6f7b83d0e07b.slice: no such file or directory Mar 01 10:02:21 crc kubenswrapper[4792]: W0301 10:02:21.556371 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8.scope WatchSource:0}: Error finding container 397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8: Status 404 returned error can't find the container with id 397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8 Mar 01 10:02:21 crc kubenswrapper[4792]: W0301 10:02:21.559824 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34.scope WatchSource:0}: Error finding container 1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34: Status 404 returned error can't find the container with id 
1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34 Mar 01 10:02:21 crc kubenswrapper[4792]: W0301 10:02:21.560281 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6.scope WatchSource:0}: Error finding container db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6: Status 404 returned error can't find the container with id db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6 Mar 01 10:02:21 crc kubenswrapper[4792]: E0301 10:02:21.828521 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e20671b_de40_40a8_8237_2bd9940b9af5.slice/crio-conmon-f8623324113a06ed0f840a7abbd929eaf57c541d1aae8f4b05f1bf6b431b24fe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e20671b_de40_40a8_8237_2bd9940b9af5.slice/crio-conmon-e9350b459945620c5d55700a03e5aa49740d1a4854b26be997bf075b28191bfe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e20671b_de40_40a8_8237_2bd9940b9af5.slice/crio-f8623324113a06ed0f840a7abbd929eaf57c541d1aae8f4b05f1bf6b431b24fe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-ce25965b24241e019e5342dcd3aa81c21c3982896b8fa1aed06c530d8002ff34\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e20671b_de40_40a8_8237_2bd9940b9af5.slice/crio-e9350b459945620c5d55700a03e5aa49740d1a4854b26be997bf075b28191bfe.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-conmon-60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189.scope\": RecentStats: unable to find data in memory cache], [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e20671b_de40_40a8_8237_2bd9940b9af5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67e060ef_1cc6_4b39_8622_bbcc183bdda0.slice/crio-7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-conmon-1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e20671b_de40_40a8_8237_2bd9940b9af5.slice/crio-4f751b65cd13e8f9d7bbd49d2e059dd52782b4f71c9b59600e77ee78c852e20a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-conmon-db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67e060ef_1cc6_4b39_8622_bbcc183bdda0.slice/crio-conmon-7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-conmon-397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8.scope\": RecentStats: unable to find data in memory cache]" Mar 01 10:02:21 crc kubenswrapper[4792]: I0301 10:02:21.940973 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.011479 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-combined-ca-bundle\") pod \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.011857 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-secret-key\") pod \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.011917 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-tls-certs\") pod \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.011951 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv6tz\" (UniqueName: 
\"kubernetes.io/projected/67e060ef-1cc6-4b39-8622-bbcc183bdda0-kube-api-access-rv6tz\") pod \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.012013 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67e060ef-1cc6-4b39-8622-bbcc183bdda0-logs\") pod \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.012112 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-scripts\") pod \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.012168 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-config-data\") pod \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.013782 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67e060ef-1cc6-4b39-8622-bbcc183bdda0-logs" (OuterVolumeSpecName: "logs") pod "67e060ef-1cc6-4b39-8622-bbcc183bdda0" (UID: "67e060ef-1cc6-4b39-8622-bbcc183bdda0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.031185 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e060ef-1cc6-4b39-8622-bbcc183bdda0-kube-api-access-rv6tz" (OuterVolumeSpecName: "kube-api-access-rv6tz") pod "67e060ef-1cc6-4b39-8622-bbcc183bdda0" (UID: "67e060ef-1cc6-4b39-8622-bbcc183bdda0"). 
InnerVolumeSpecName "kube-api-access-rv6tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.031629 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "67e060ef-1cc6-4b39-8622-bbcc183bdda0" (UID: "67e060ef-1cc6-4b39-8622-bbcc183bdda0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.034444 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-config-data" (OuterVolumeSpecName: "config-data") pod "67e060ef-1cc6-4b39-8622-bbcc183bdda0" (UID: "67e060ef-1cc6-4b39-8622-bbcc183bdda0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.040092 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-scripts" (OuterVolumeSpecName: "scripts") pod "67e060ef-1cc6-4b39-8622-bbcc183bdda0" (UID: "67e060ef-1cc6-4b39-8622-bbcc183bdda0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.047263 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67e060ef-1cc6-4b39-8622-bbcc183bdda0" (UID: "67e060ef-1cc6-4b39-8622-bbcc183bdda0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.081632 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "67e060ef-1cc6-4b39-8622-bbcc183bdda0" (UID: "67e060ef-1cc6-4b39-8622-bbcc183bdda0"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.114229 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.114267 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.114278 4792 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.114286 4792 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.114296 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv6tz\" (UniqueName: \"kubernetes.io/projected/67e060ef-1cc6-4b39-8622-bbcc183bdda0-kube-api-access-rv6tz\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.114307 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/67e060ef-1cc6-4b39-8622-bbcc183bdda0-logs\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.114314 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.547959 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.548341 4792 generic.go:334] "Generic (PLEG): container finished" podID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerID="7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce" exitCode=137 Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.548223 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689c76c966-7mbkl" event={"ID":"67e060ef-1cc6-4b39-8622-bbcc183bdda0","Type":"ContainerDied","Data":"7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce"} Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.551595 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689c76c966-7mbkl" event={"ID":"67e060ef-1cc6-4b39-8622-bbcc183bdda0","Type":"ContainerDied","Data":"ca973dc9b124eaa6124a0d372262dadb60856c6d51900f904c6c257a734ebfb0"} Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.551625 4792 scope.go:117] "RemoveContainer" containerID="809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.611226 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-689c76c966-7mbkl"] Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.627768 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-689c76c966-7mbkl"] Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.748432 4792 
scope.go:117] "RemoveContainer" containerID="7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.781586 4792 scope.go:117] "RemoveContainer" containerID="809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b" Mar 01 10:02:22 crc kubenswrapper[4792]: E0301 10:02:22.782729 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b\": container with ID starting with 809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b not found: ID does not exist" containerID="809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.782774 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b"} err="failed to get container status \"809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b\": rpc error: code = NotFound desc = could not find container \"809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b\": container with ID starting with 809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b not found: ID does not exist" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.782803 4792 scope.go:117] "RemoveContainer" containerID="7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce" Mar 01 10:02:22 crc kubenswrapper[4792]: E0301 10:02:22.783430 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce\": container with ID starting with 7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce not found: ID does not exist" containerID="7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce" Mar 01 
10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.783462 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce"} err="failed to get container status \"7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce\": rpc error: code = NotFound desc = could not find container \"7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce\": container with ID starting with 7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce not found: ID does not exist" Mar 01 10:02:23 crc kubenswrapper[4792]: I0301 10:02:23.424192 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" path="/var/lib/kubelet/pods/67e060ef-1cc6-4b39-8622-bbcc183bdda0/volumes" Mar 01 10:02:24 crc kubenswrapper[4792]: I0301 10:02:24.408732 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:02:24 crc kubenswrapper[4792]: E0301 10:02:24.409259 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:02:28 crc kubenswrapper[4792]: I0301 10:02:28.411300 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 01 10:02:32 crc kubenswrapper[4792]: I0301 10:02:32.164213 4792 scope.go:117] "RemoveContainer" containerID="3aeb9f44a1b454186ac24af4b6119b2ea036267663153223c161c28c89a3a926" Mar 01 10:02:36 crc kubenswrapper[4792]: I0301 10:02:36.133114 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ceilometer-0"
Mar 01 10:02:38 crc kubenswrapper[4792]: I0301 10:02:38.408926 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"
Mar 01 10:02:38 crc kubenswrapper[4792]: E0301 10:02:38.410151 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:02:53 crc kubenswrapper[4792]: I0301 10:02:53.411091 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"
Mar 01 10:02:53 crc kubenswrapper[4792]: E0301 10:02:53.411921 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:03:06 crc kubenswrapper[4792]: I0301 10:03:06.409183 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"
Mar 01 10:03:06 crc kubenswrapper[4792]: E0301 10:03:06.410069 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:03:21 crc kubenswrapper[4792]: I0301 10:03:21.416371 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"
Mar 01 10:03:21 crc kubenswrapper[4792]: E0301 10:03:21.417348 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:03:34 crc kubenswrapper[4792]: I0301 10:03:34.409793 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"
Mar 01 10:03:34 crc kubenswrapper[4792]: E0301 10:03:34.410526 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.014732 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Mar 01 10:03:38 crc kubenswrapper[4792]: E0301 10:03:38.015774 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon-log"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.015793 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon-log"
Mar 01 10:03:38 crc kubenswrapper[4792]: E0301 10:03:38.015836 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.015848 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.016105 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon-log"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.016129 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.016962 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.018763 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.018993 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.019260 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nl48r"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.021397 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.041305 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.107748 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.108087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.108246 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.108451 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.108758 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-config-data\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.108838 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.109075 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.109196 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.109362 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slzdj\" (UniqueName: \"kubernetes.io/projected/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-kube-api-access-slzdj\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.210651 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.210743 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-config-data\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.210779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.210862 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.210886 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.210947 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slzdj\" (UniqueName: \"kubernetes.io/projected/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-kube-api-access-slzdj\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.211009 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.211403 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.211637 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.212271 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.212985 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.213135 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.213365 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-config-data\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.213453 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.216949 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.221806 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.221987 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.233077 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slzdj\" (UniqueName: \"kubernetes.io/projected/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-kube-api-access-slzdj\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.243597 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.341408 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.799738 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Mar 01 10:03:38 crc kubenswrapper[4792]: W0301 10:03:38.804981 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee1c75ce_61f7_4ce5_a757_b7405d7135bd.slice/crio-56c23af9ad5afce4e187f9687e375e2437b0d33726c9ce5a2f51b334bc434432 WatchSource:0}: Error finding container 56c23af9ad5afce4e187f9687e375e2437b0d33726c9ce5a2f51b334bc434432: Status 404 returned error can't find the container with id 56c23af9ad5afce4e187f9687e375e2437b0d33726c9ce5a2f51b334bc434432
Mar 01 10:03:39 crc kubenswrapper[4792]: I0301 10:03:39.730460 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ee1c75ce-61f7-4ce5-a757-b7405d7135bd","Type":"ContainerStarted","Data":"56c23af9ad5afce4e187f9687e375e2437b0d33726c9ce5a2f51b334bc434432"}
Mar 01 10:03:47 crc kubenswrapper[4792]: I0301 10:03:47.408708 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"
Mar 01 10:03:47 crc kubenswrapper[4792]: E0301 10:03:47.409600 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.138201 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539324-wjq94"]
Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.140433 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539324-wjq94"
Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.142992 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz"
Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.143069 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.143092 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.160320 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539324-wjq94"]
Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.179093 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txf4b\" (UniqueName: \"kubernetes.io/projected/41b7071d-7243-4ca4-82e6-c153c3001d1f-kube-api-access-txf4b\") pod \"auto-csr-approver-29539324-wjq94\" (UID: \"41b7071d-7243-4ca4-82e6-c153c3001d1f\") " pod="openshift-infra/auto-csr-approver-29539324-wjq94"
Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.280754 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txf4b\" (UniqueName: \"kubernetes.io/projected/41b7071d-7243-4ca4-82e6-c153c3001d1f-kube-api-access-txf4b\") pod \"auto-csr-approver-29539324-wjq94\" (UID: \"41b7071d-7243-4ca4-82e6-c153c3001d1f\") " pod="openshift-infra/auto-csr-approver-29539324-wjq94"
Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.301829 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txf4b\" (UniqueName: \"kubernetes.io/projected/41b7071d-7243-4ca4-82e6-c153c3001d1f-kube-api-access-txf4b\") pod \"auto-csr-approver-29539324-wjq94\" (UID: \"41b7071d-7243-4ca4-82e6-c153c3001d1f\") " pod="openshift-infra/auto-csr-approver-29539324-wjq94"
Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.460518 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539324-wjq94"
Mar 01 10:04:02 crc kubenswrapper[4792]: I0301 10:04:02.410757 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"
Mar 01 10:04:02 crc kubenswrapper[4792]: E0301 10:04:02.411364 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:04:12 crc kubenswrapper[4792]: E0301 10:04:12.419075 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
Mar 01 10:04:12 crc kubenswrapper[4792]: E0301 10:04:12.420680 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-slzdj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(ee1c75ce-61f7-4ce5-a757-b7405d7135bd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 01 10:04:12 crc kubenswrapper[4792]: E0301 10:04:12.421855 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="ee1c75ce-61f7-4ce5-a757-b7405d7135bd"
Mar 01 10:04:12 crc kubenswrapper[4792]: I0301 10:04:12.844984 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539324-wjq94"]
Mar 01 10:04:13 crc kubenswrapper[4792]: I0301 10:04:13.050632 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539324-wjq94" event={"ID":"41b7071d-7243-4ca4-82e6-c153c3001d1f","Type":"ContainerStarted","Data":"05ecd0e3b57f89e85c7cc446a097e5719dfaf3e02c4997b31325f714bb21377a"}
Mar 01 10:04:13 crc kubenswrapper[4792]: E0301 10:04:13.052293 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="ee1c75ce-61f7-4ce5-a757-b7405d7135bd"
Mar 01 10:04:13 crc kubenswrapper[4792]: I0301 10:04:13.408303 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"
Mar 01 10:04:13 crc kubenswrapper[4792]: E0301 10:04:13.408631 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:04:15 crc kubenswrapper[4792]: I0301 10:04:15.072700 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539324-wjq94" event={"ID":"41b7071d-7243-4ca4-82e6-c153c3001d1f","Type":"ContainerStarted","Data":"e004ff88b9b0e0f691af76b372e4044089dce1eaaf6619be9de9fed1b4d58c18"}
Mar 01 10:04:15 crc kubenswrapper[4792]: I0301 10:04:15.094893 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539324-wjq94" podStartSLOduration=14.252390903 podStartE2EDuration="15.094871951s" podCreationTimestamp="2026-03-01 10:04:00 +0000 UTC" firstStartedPulling="2026-03-01 10:04:12.842069752 +0000 UTC m=+3382.083948949" lastFinishedPulling="2026-03-01 10:04:13.6845508 +0000 UTC m=+3382.926429997" observedRunningTime="2026-03-01 10:04:15.088141873 +0000 UTC m=+3384.330021100" watchObservedRunningTime="2026-03-01 10:04:15.094871951 +0000 UTC m=+3384.336751158"
Mar 01 10:04:16 crc kubenswrapper[4792]: I0301 10:04:16.083872 4792 generic.go:334] "Generic (PLEG): container finished" podID="41b7071d-7243-4ca4-82e6-c153c3001d1f" containerID="e004ff88b9b0e0f691af76b372e4044089dce1eaaf6619be9de9fed1b4d58c18" exitCode=0
Mar 01 10:04:16 crc kubenswrapper[4792]: I0301 10:04:16.083982 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539324-wjq94" event={"ID":"41b7071d-7243-4ca4-82e6-c153c3001d1f","Type":"ContainerDied","Data":"e004ff88b9b0e0f691af76b372e4044089dce1eaaf6619be9de9fed1b4d58c18"}
Mar 01 10:04:17 crc kubenswrapper[4792]: I0301 10:04:17.474127 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539324-wjq94"
Mar 01 10:04:17 crc kubenswrapper[4792]: I0301 10:04:17.667296 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txf4b\" (UniqueName: \"kubernetes.io/projected/41b7071d-7243-4ca4-82e6-c153c3001d1f-kube-api-access-txf4b\") pod \"41b7071d-7243-4ca4-82e6-c153c3001d1f\" (UID: \"41b7071d-7243-4ca4-82e6-c153c3001d1f\") "
Mar 01 10:04:17 crc kubenswrapper[4792]: I0301 10:04:17.673133 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41b7071d-7243-4ca4-82e6-c153c3001d1f-kube-api-access-txf4b" (OuterVolumeSpecName: "kube-api-access-txf4b") pod "41b7071d-7243-4ca4-82e6-c153c3001d1f" (UID: "41b7071d-7243-4ca4-82e6-c153c3001d1f"). InnerVolumeSpecName "kube-api-access-txf4b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:04:17 crc kubenswrapper[4792]: I0301 10:04:17.770895 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txf4b\" (UniqueName: \"kubernetes.io/projected/41b7071d-7243-4ca4-82e6-c153c3001d1f-kube-api-access-txf4b\") on node \"crc\" DevicePath \"\""
Mar 01 10:04:18 crc kubenswrapper[4792]: I0301 10:04:18.101637 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539324-wjq94" event={"ID":"41b7071d-7243-4ca4-82e6-c153c3001d1f","Type":"ContainerDied","Data":"05ecd0e3b57f89e85c7cc446a097e5719dfaf3e02c4997b31325f714bb21377a"}
Mar 01 10:04:18 crc kubenswrapper[4792]: I0301 10:04:18.101673 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05ecd0e3b57f89e85c7cc446a097e5719dfaf3e02c4997b31325f714bb21377a"
Mar 01 10:04:18 crc kubenswrapper[4792]: I0301 10:04:18.101702 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539324-wjq94"
Mar 01 10:04:18 crc kubenswrapper[4792]: I0301 10:04:18.155630 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539318-d9tmc"]
Mar 01 10:04:18 crc kubenswrapper[4792]: I0301 10:04:18.164435 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539318-d9tmc"]
Mar 01 10:04:19 crc kubenswrapper[4792]: I0301 10:04:19.419242 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf3aad6a-fbd9-4a24-a489-33507709811b" path="/var/lib/kubelet/pods/bf3aad6a-fbd9-4a24-a489-33507709811b/volumes"
Mar 01 10:04:25 crc kubenswrapper[4792]: I0301 10:04:25.885297 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Mar 01 10:04:27 crc kubenswrapper[4792]: I0301 10:04:27.190023 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ee1c75ce-61f7-4ce5-a757-b7405d7135bd","Type":"ContainerStarted","Data":"0b1c921f1338ea9b8f3dd9b08a6d658d40119ab101643c7964b03d38bfa73f47"}
Mar 01 10:04:27 crc kubenswrapper[4792]: I0301 10:04:27.218384 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.143779626 podStartE2EDuration="51.218366932s" podCreationTimestamp="2026-03-01 10:03:36 +0000 UTC" firstStartedPulling="2026-03-01 10:03:38.807720917 +0000 UTC m=+3348.049600114" lastFinishedPulling="2026-03-01 10:04:25.882308223 +0000 UTC m=+3395.124187420" observedRunningTime="2026-03-01 10:04:27.208047115 +0000 UTC m=+3396.449926322" watchObservedRunningTime="2026-03-01 10:04:27.218366932 +0000 UTC m=+3396.460246129"
Mar 01 10:04:27 crc kubenswrapper[4792]: I0301 10:04:27.408535 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"
Mar 01 10:04:27 crc kubenswrapper[4792]: E0301 10:04:27.408804 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:04:32 crc kubenswrapper[4792]: I0301 10:04:32.415460 4792 scope.go:117] "RemoveContainer" containerID="7152ca7878f74975420b6650bf54cd79c2b676e3ea865b2cbf55b92459e46fa6"
Mar 01 10:04:42 crc kubenswrapper[4792]: I0301 10:04:42.409453 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"
Mar 01 10:04:42 crc kubenswrapper[4792]: E0301 10:04:42.410332 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:04:56 crc kubenswrapper[4792]: I0301 10:04:56.408353 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"
Mar 01 10:04:56 crc kubenswrapper[4792]: E0301 10:04:56.409595 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.106009 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-smtml"]
Mar 01 10:05:00 crc kubenswrapper[4792]: E0301 10:05:00.107050 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b7071d-7243-4ca4-82e6-c153c3001d1f" containerName="oc"
Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.107066 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b7071d-7243-4ca4-82e6-c153c3001d1f" containerName="oc"
Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.107313 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b7071d-7243-4ca4-82e6-c153c3001d1f" containerName="oc"
Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.110125 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smtml"
Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.128593 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smtml"]
Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.289959 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn2lb\" (UniqueName: \"kubernetes.io/projected/b24525d9-4be1-47e3-9588-e0747410912c-kube-api-access-gn2lb\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml"
Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.290441 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-utilities\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml"
Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.290505 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-catalog-content\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml"
Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.392279 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-utilities\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml"
Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.392365 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-catalog-content\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.392496 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn2lb\" (UniqueName: \"kubernetes.io/projected/b24525d9-4be1-47e3-9588-e0747410912c-kube-api-access-gn2lb\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.392889 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-utilities\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.393049 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-catalog-content\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.415848 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn2lb\" (UniqueName: \"kubernetes.io/projected/b24525d9-4be1-47e3-9588-e0747410912c-kube-api-access-gn2lb\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.427815 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.880201 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smtml"] Mar 01 10:05:01 crc kubenswrapper[4792]: I0301 10:05:01.547141 4792 generic.go:334] "Generic (PLEG): container finished" podID="b24525d9-4be1-47e3-9588-e0747410912c" containerID="578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92" exitCode=0 Mar 01 10:05:01 crc kubenswrapper[4792]: I0301 10:05:01.547400 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smtml" event={"ID":"b24525d9-4be1-47e3-9588-e0747410912c","Type":"ContainerDied","Data":"578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92"} Mar 01 10:05:01 crc kubenswrapper[4792]: I0301 10:05:01.548064 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smtml" event={"ID":"b24525d9-4be1-47e3-9588-e0747410912c","Type":"ContainerStarted","Data":"6dea6bb4d8bc1761ebada466cb6f422a987107e9b6cee2ade5e5ce671abc23fd"} Mar 01 10:05:02 crc kubenswrapper[4792]: I0301 10:05:02.557195 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smtml" event={"ID":"b24525d9-4be1-47e3-9588-e0747410912c","Type":"ContainerStarted","Data":"b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1"} Mar 01 10:05:07 crc kubenswrapper[4792]: I0301 10:05:07.599253 4792 generic.go:334] "Generic (PLEG): container finished" podID="b24525d9-4be1-47e3-9588-e0747410912c" containerID="b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1" exitCode=0 Mar 01 10:05:07 crc kubenswrapper[4792]: I0301 10:05:07.599331 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smtml" 
event={"ID":"b24525d9-4be1-47e3-9588-e0747410912c","Type":"ContainerDied","Data":"b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1"} Mar 01 10:05:08 crc kubenswrapper[4792]: I0301 10:05:08.409727 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:05:08 crc kubenswrapper[4792]: E0301 10:05:08.411086 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:05:08 crc kubenswrapper[4792]: I0301 10:05:08.624315 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smtml" event={"ID":"b24525d9-4be1-47e3-9588-e0747410912c","Type":"ContainerStarted","Data":"8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55"} Mar 01 10:05:08 crc kubenswrapper[4792]: I0301 10:05:08.650808 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-smtml" podStartSLOduration=2.165407095 podStartE2EDuration="8.650784495s" podCreationTimestamp="2026-03-01 10:05:00 +0000 UTC" firstStartedPulling="2026-03-01 10:05:01.549994467 +0000 UTC m=+3430.791873664" lastFinishedPulling="2026-03-01 10:05:08.035371837 +0000 UTC m=+3437.277251064" observedRunningTime="2026-03-01 10:05:08.644594971 +0000 UTC m=+3437.886474178" watchObservedRunningTime="2026-03-01 10:05:08.650784495 +0000 UTC m=+3437.892663702" Mar 01 10:05:10 crc kubenswrapper[4792]: I0301 10:05:10.428892 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:10 crc 
kubenswrapper[4792]: I0301 10:05:10.429323 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:11 crc kubenswrapper[4792]: I0301 10:05:11.480178 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-smtml" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" probeResult="failure" output=< Mar 01 10:05:11 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:05:11 crc kubenswrapper[4792]: > Mar 01 10:05:21 crc kubenswrapper[4792]: I0301 10:05:21.475547 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-smtml" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" probeResult="failure" output=< Mar 01 10:05:21 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:05:21 crc kubenswrapper[4792]: > Mar 01 10:05:22 crc kubenswrapper[4792]: I0301 10:05:22.408646 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:05:22 crc kubenswrapper[4792]: E0301 10:05:22.409242 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:05:31 crc kubenswrapper[4792]: I0301 10:05:31.472725 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-smtml" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" probeResult="failure" output=< Mar 01 10:05:31 crc kubenswrapper[4792]: timeout: 
failed to connect service ":50051" within 1s Mar 01 10:05:31 crc kubenswrapper[4792]: > Mar 01 10:05:36 crc kubenswrapper[4792]: I0301 10:05:36.409197 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:05:36 crc kubenswrapper[4792]: E0301 10:05:36.410683 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:05:41 crc kubenswrapper[4792]: I0301 10:05:41.469815 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-smtml" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" probeResult="failure" output=< Mar 01 10:05:41 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:05:41 crc kubenswrapper[4792]: > Mar 01 10:05:50 crc kubenswrapper[4792]: I0301 10:05:50.409111 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:05:50 crc kubenswrapper[4792]: E0301 10:05:50.409947 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:05:50 crc kubenswrapper[4792]: I0301 10:05:50.478183 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:50 crc kubenswrapper[4792]: I0301 10:05:50.533756 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:50 crc kubenswrapper[4792]: I0301 10:05:50.712549 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smtml"] Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.002629 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-smtml" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" containerID="cri-o://8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55" gracePeriod=2 Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.561459 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.619840 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn2lb\" (UniqueName: \"kubernetes.io/projected/b24525d9-4be1-47e3-9588-e0747410912c-kube-api-access-gn2lb\") pod \"b24525d9-4be1-47e3-9588-e0747410912c\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.620090 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-utilities\") pod \"b24525d9-4be1-47e3-9588-e0747410912c\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.620430 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-catalog-content\") pod \"b24525d9-4be1-47e3-9588-e0747410912c\" (UID: 
\"b24525d9-4be1-47e3-9588-e0747410912c\") " Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.621426 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-utilities" (OuterVolumeSpecName: "utilities") pod "b24525d9-4be1-47e3-9588-e0747410912c" (UID: "b24525d9-4be1-47e3-9588-e0747410912c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.627811 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b24525d9-4be1-47e3-9588-e0747410912c-kube-api-access-gn2lb" (OuterVolumeSpecName: "kube-api-access-gn2lb") pod "b24525d9-4be1-47e3-9588-e0747410912c" (UID: "b24525d9-4be1-47e3-9588-e0747410912c"). InnerVolumeSpecName "kube-api-access-gn2lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.723112 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.723147 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn2lb\" (UniqueName: \"kubernetes.io/projected/b24525d9-4be1-47e3-9588-e0747410912c-kube-api-access-gn2lb\") on node \"crc\" DevicePath \"\"" Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.793042 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b24525d9-4be1-47e3-9588-e0747410912c" (UID: "b24525d9-4be1-47e3-9588-e0747410912c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.825620 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.014323 4792 generic.go:334] "Generic (PLEG): container finished" podID="b24525d9-4be1-47e3-9588-e0747410912c" containerID="8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55" exitCode=0 Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.014465 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smtml" event={"ID":"b24525d9-4be1-47e3-9588-e0747410912c","Type":"ContainerDied","Data":"8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55"} Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.014739 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smtml" event={"ID":"b24525d9-4be1-47e3-9588-e0747410912c","Type":"ContainerDied","Data":"6dea6bb4d8bc1761ebada466cb6f422a987107e9b6cee2ade5e5ce671abc23fd"} Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.014769 4792 scope.go:117] "RemoveContainer" containerID="8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.014547 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.040038 4792 scope.go:117] "RemoveContainer" containerID="b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.067157 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smtml"] Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.070332 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-smtml"] Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.074165 4792 scope.go:117] "RemoveContainer" containerID="578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.144109 4792 scope.go:117] "RemoveContainer" containerID="8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55" Mar 01 10:05:53 crc kubenswrapper[4792]: E0301 10:05:53.146304 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55\": container with ID starting with 8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55 not found: ID does not exist" containerID="8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.146347 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55"} err="failed to get container status \"8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55\": rpc error: code = NotFound desc = could not find container \"8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55\": container with ID starting with 8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55 not found: ID does 
not exist" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.146374 4792 scope.go:117] "RemoveContainer" containerID="b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1" Mar 01 10:05:53 crc kubenswrapper[4792]: E0301 10:05:53.147472 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1\": container with ID starting with b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1 not found: ID does not exist" containerID="b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.147529 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1"} err="failed to get container status \"b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1\": rpc error: code = NotFound desc = could not find container \"b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1\": container with ID starting with b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1 not found: ID does not exist" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.147559 4792 scope.go:117] "RemoveContainer" containerID="578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92" Mar 01 10:05:53 crc kubenswrapper[4792]: E0301 10:05:53.148359 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92\": container with ID starting with 578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92 not found: ID does not exist" containerID="578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.148386 4792 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92"} err="failed to get container status \"578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92\": rpc error: code = NotFound desc = could not find container \"578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92\": container with ID starting with 578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92 not found: ID does not exist" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.423890 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b24525d9-4be1-47e3-9588-e0747410912c" path="/var/lib/kubelet/pods/b24525d9-4be1-47e3-9588-e0747410912c/volumes" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.147880 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539326-7hz59"] Mar 01 10:06:00 crc kubenswrapper[4792]: E0301 10:06:00.148766 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="extract-content" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.148778 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="extract-content" Mar 01 10:06:00 crc kubenswrapper[4792]: E0301 10:06:00.148805 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.148810 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" Mar 01 10:06:00 crc kubenswrapper[4792]: E0301 10:06:00.148833 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="extract-utilities" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.148839 4792 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="extract-utilities" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.149036 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.149660 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539326-7hz59" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.152429 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.152616 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.155240 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.157692 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539326-7hz59"] Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.261020 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fk2t\" (UniqueName: \"kubernetes.io/projected/e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8-kube-api-access-5fk2t\") pod \"auto-csr-approver-29539326-7hz59\" (UID: \"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8\") " pod="openshift-infra/auto-csr-approver-29539326-7hz59" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.362805 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fk2t\" (UniqueName: \"kubernetes.io/projected/e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8-kube-api-access-5fk2t\") pod \"auto-csr-approver-29539326-7hz59\" (UID: 
\"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8\") " pod="openshift-infra/auto-csr-approver-29539326-7hz59" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.407097 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fk2t\" (UniqueName: \"kubernetes.io/projected/e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8-kube-api-access-5fk2t\") pod \"auto-csr-approver-29539326-7hz59\" (UID: \"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8\") " pod="openshift-infra/auto-csr-approver-29539326-7hz59" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.498989 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539326-7hz59" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.982868 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539326-7hz59"] Mar 01 10:06:01 crc kubenswrapper[4792]: I0301 10:06:01.085342 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539326-7hz59" event={"ID":"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8","Type":"ContainerStarted","Data":"10e06fef5e07f2fd1cf59f8e8f7565d20e61b5a4b1dd074783b869da3bce167a"} Mar 01 10:06:01 crc kubenswrapper[4792]: I0301 10:06:01.417963 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:06:01 crc kubenswrapper[4792]: E0301 10:06:01.418356 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:06:03 crc kubenswrapper[4792]: I0301 10:06:03.100679 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8" containerID="985260a2ba2153789f87b2fc888d57bf9b851f86fd15aaf0dca4797eeb86773f" exitCode=0 Mar 01 10:06:03 crc kubenswrapper[4792]: I0301 10:06:03.101047 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539326-7hz59" event={"ID":"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8","Type":"ContainerDied","Data":"985260a2ba2153789f87b2fc888d57bf9b851f86fd15aaf0dca4797eeb86773f"} Mar 01 10:06:04 crc kubenswrapper[4792]: I0301 10:06:04.520211 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539326-7hz59" Mar 01 10:06:04 crc kubenswrapper[4792]: I0301 10:06:04.543374 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fk2t\" (UniqueName: \"kubernetes.io/projected/e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8-kube-api-access-5fk2t\") pod \"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8\" (UID: \"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8\") " Mar 01 10:06:04 crc kubenswrapper[4792]: I0301 10:06:04.594173 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8-kube-api-access-5fk2t" (OuterVolumeSpecName: "kube-api-access-5fk2t") pod "e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8" (UID: "e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8"). InnerVolumeSpecName "kube-api-access-5fk2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:06:04 crc kubenswrapper[4792]: I0301 10:06:04.646110 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fk2t\" (UniqueName: \"kubernetes.io/projected/e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8-kube-api-access-5fk2t\") on node \"crc\" DevicePath \"\"" Mar 01 10:06:05 crc kubenswrapper[4792]: I0301 10:06:05.117455 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539326-7hz59" event={"ID":"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8","Type":"ContainerDied","Data":"10e06fef5e07f2fd1cf59f8e8f7565d20e61b5a4b1dd074783b869da3bce167a"} Mar 01 10:06:05 crc kubenswrapper[4792]: I0301 10:06:05.117749 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10e06fef5e07f2fd1cf59f8e8f7565d20e61b5a4b1dd074783b869da3bce167a" Mar 01 10:06:05 crc kubenswrapper[4792]: I0301 10:06:05.117488 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539326-7hz59" Mar 01 10:06:05 crc kubenswrapper[4792]: I0301 10:06:05.585326 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539320-bzqcr"] Mar 01 10:06:05 crc kubenswrapper[4792]: I0301 10:06:05.593294 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539320-bzqcr"] Mar 01 10:06:07 crc kubenswrapper[4792]: I0301 10:06:07.421671 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5c085ec-b23e-4ad9-ae76-9775921b667d" path="/var/lib/kubelet/pods/a5c085ec-b23e-4ad9-ae76-9775921b667d/volumes" Mar 01 10:06:13 crc kubenswrapper[4792]: I0301 10:06:13.408723 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:06:13 crc kubenswrapper[4792]: E0301 10:06:13.409641 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:06:27 crc kubenswrapper[4792]: I0301 10:06:27.414034 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:06:27 crc kubenswrapper[4792]: E0301 10:06:27.414677 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:06:32 crc kubenswrapper[4792]: I0301 10:06:32.527520 4792 scope.go:117] "RemoveContainer" containerID="4994cd62771f4057b6d1f58071d3828ce7f1350994c490b87bf0e5cb1e97dff9" Mar 01 10:06:39 crc kubenswrapper[4792]: I0301 10:06:39.409675 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:06:40 crc kubenswrapper[4792]: I0301 10:06:40.548851 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"947fc3b8131dfb1dafd7693a1530cb8db816d30e8340c8877587b0f93967d46d"} Mar 01 10:07:09 crc kubenswrapper[4792]: I0301 10:07:09.907563 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-48qp5"] Mar 01 10:07:09 crc kubenswrapper[4792]: E0301 10:07:09.908393 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8" containerName="oc" Mar 01 10:07:09 crc kubenswrapper[4792]: I0301 10:07:09.908406 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8" containerName="oc" Mar 01 10:07:09 crc kubenswrapper[4792]: I0301 10:07:09.908591 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8" containerName="oc" Mar 01 10:07:09 crc kubenswrapper[4792]: I0301 10:07:09.910570 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:09 crc kubenswrapper[4792]: I0301 10:07:09.924884 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-48qp5"] Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.089645 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvpl7\" (UniqueName: \"kubernetes.io/projected/682f4383-d3fb-4efe-89f5-e496b4b3b71b-kube-api-access-xvpl7\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.089711 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-catalog-content\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.089819 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-utilities\") pod \"community-operators-48qp5\" (UID: 
\"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.192006 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvpl7\" (UniqueName: \"kubernetes.io/projected/682f4383-d3fb-4efe-89f5-e496b4b3b71b-kube-api-access-xvpl7\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.192066 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-catalog-content\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.192196 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-utilities\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.192728 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-utilities\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.193012 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-catalog-content\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") 
" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.218211 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvpl7\" (UniqueName: \"kubernetes.io/projected/682f4383-d3fb-4efe-89f5-e496b4b3b71b-kube-api-access-xvpl7\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.238424 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.721144 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-48qp5"] Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.793762 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48qp5" event={"ID":"682f4383-d3fb-4efe-89f5-e496b4b3b71b","Type":"ContainerStarted","Data":"94eba7eaa028ed08ef077cfc6746bc96f898cc9f246702b2a5a689eae3288e03"} Mar 01 10:07:11 crc kubenswrapper[4792]: I0301 10:07:11.803581 4792 generic.go:334] "Generic (PLEG): container finished" podID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerID="550d4cbb086003d878b362fc46f3972d5e8a3737e8bfdb31dd607b8a5db535b2" exitCode=0 Mar 01 10:07:11 crc kubenswrapper[4792]: I0301 10:07:11.803659 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48qp5" event={"ID":"682f4383-d3fb-4efe-89f5-e496b4b3b71b","Type":"ContainerDied","Data":"550d4cbb086003d878b362fc46f3972d5e8a3737e8bfdb31dd607b8a5db535b2"} Mar 01 10:07:11 crc kubenswrapper[4792]: I0301 10:07:11.808226 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 10:07:12 crc kubenswrapper[4792]: I0301 10:07:12.813489 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-48qp5" event={"ID":"682f4383-d3fb-4efe-89f5-e496b4b3b71b","Type":"ContainerStarted","Data":"be7a136c39cab1fb688e8478bf8ada2e256e2ff41b168a20058a43540f860198"} Mar 01 10:07:14 crc kubenswrapper[4792]: I0301 10:07:14.830717 4792 generic.go:334] "Generic (PLEG): container finished" podID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerID="be7a136c39cab1fb688e8478bf8ada2e256e2ff41b168a20058a43540f860198" exitCode=0 Mar 01 10:07:14 crc kubenswrapper[4792]: I0301 10:07:14.830774 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48qp5" event={"ID":"682f4383-d3fb-4efe-89f5-e496b4b3b71b","Type":"ContainerDied","Data":"be7a136c39cab1fb688e8478bf8ada2e256e2ff41b168a20058a43540f860198"} Mar 01 10:07:15 crc kubenswrapper[4792]: I0301 10:07:15.841181 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48qp5" event={"ID":"682f4383-d3fb-4efe-89f5-e496b4b3b71b","Type":"ContainerStarted","Data":"539ab3bdb0f8e33c45ad9a4e194bc21761a2d32030aa18bd3fd289463fab5c69"} Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.021636 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-48qp5" podStartSLOduration=7.6135256909999995 podStartE2EDuration="11.021616673s" podCreationTimestamp="2026-03-01 10:07:09 +0000 UTC" firstStartedPulling="2026-03-01 10:07:11.808010779 +0000 UTC m=+3561.049889976" lastFinishedPulling="2026-03-01 10:07:15.216101761 +0000 UTC m=+3564.457980958" observedRunningTime="2026-03-01 10:07:15.86283305 +0000 UTC m=+3565.104712257" watchObservedRunningTime="2026-03-01 10:07:20.021616673 +0000 UTC m=+3569.263495870" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.030786 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pph6g"] Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.032680 4792 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.056114 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pph6g"] Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.216814 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-catalog-content\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.216931 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch59q\" (UniqueName: \"kubernetes.io/projected/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-kube-api-access-ch59q\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.216994 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-utilities\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.239343 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.239821 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.319700 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-catalog-content\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.319754 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch59q\" (UniqueName: \"kubernetes.io/projected/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-kube-api-access-ch59q\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.319779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-utilities\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.321274 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-utilities\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.321371 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-catalog-content\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.347648 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ch59q\" (UniqueName: \"kubernetes.io/projected/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-kube-api-access-ch59q\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.358236 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.930548 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pph6g"] Mar 01 10:07:21 crc kubenswrapper[4792]: I0301 10:07:21.287869 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-48qp5" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="registry-server" probeResult="failure" output=< Mar 01 10:07:21 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:07:21 crc kubenswrapper[4792]: > Mar 01 10:07:21 crc kubenswrapper[4792]: I0301 10:07:21.904713 4792 generic.go:334] "Generic (PLEG): container finished" podID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerID="bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74" exitCode=0 Mar 01 10:07:21 crc kubenswrapper[4792]: I0301 10:07:21.904769 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pph6g" event={"ID":"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef","Type":"ContainerDied","Data":"bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74"} Mar 01 10:07:21 crc kubenswrapper[4792]: I0301 10:07:21.904819 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pph6g" event={"ID":"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef","Type":"ContainerStarted","Data":"20eeae96dbcf40cf915287f68860afaea37e4de6c7d4d59e23e4ddf89a89c82c"} Mar 01 
10:07:22 crc kubenswrapper[4792]: I0301 10:07:22.914494 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pph6g" event={"ID":"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef","Type":"ContainerStarted","Data":"e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04"} Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.029255 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ltk8j"] Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.033969 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.058886 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ltk8j"] Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.230926 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-utilities\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.231420 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-974g4\" (UniqueName: \"kubernetes.io/projected/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-kube-api-access-974g4\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.231494 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-catalog-content\") pod \"redhat-marketplace-ltk8j\" (UID: 
\"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.334375 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-utilities\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.334458 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-974g4\" (UniqueName: \"kubernetes.io/projected/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-kube-api-access-974g4\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.334529 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-catalog-content\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.335105 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-utilities\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.335143 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-catalog-content\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " 
pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.355934 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-974g4\" (UniqueName: \"kubernetes.io/projected/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-kube-api-access-974g4\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.363212 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.858324 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ltk8j"] Mar 01 10:07:25 crc kubenswrapper[4792]: W0301 10:07:25.861995 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3512046c_8b15_4ab0_8c94_39d0d0f2e73c.slice/crio-7a85f6553115dd71b80a2c8f43818f4072ecec30d5eb3e65b978374e0bc127b2 WatchSource:0}: Error finding container 7a85f6553115dd71b80a2c8f43818f4072ecec30d5eb3e65b978374e0bc127b2: Status 404 returned error can't find the container with id 7a85f6553115dd71b80a2c8f43818f4072ecec30d5eb3e65b978374e0bc127b2 Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.940318 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltk8j" event={"ID":"3512046c-8b15-4ab0-8c94-39d0d0f2e73c","Type":"ContainerStarted","Data":"7a85f6553115dd71b80a2c8f43818f4072ecec30d5eb3e65b978374e0bc127b2"} Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.943119 4792 generic.go:334] "Generic (PLEG): container finished" podID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerID="e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04" exitCode=0 Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.943147 
4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pph6g" event={"ID":"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef","Type":"ContainerDied","Data":"e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04"} Mar 01 10:07:26 crc kubenswrapper[4792]: I0301 10:07:26.954518 4792 generic.go:334] "Generic (PLEG): container finished" podID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerID="b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3" exitCode=0 Mar 01 10:07:26 crc kubenswrapper[4792]: I0301 10:07:26.954678 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltk8j" event={"ID":"3512046c-8b15-4ab0-8c94-39d0d0f2e73c","Type":"ContainerDied","Data":"b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3"} Mar 01 10:07:26 crc kubenswrapper[4792]: I0301 10:07:26.958647 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pph6g" event={"ID":"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef","Type":"ContainerStarted","Data":"f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49"} Mar 01 10:07:26 crc kubenswrapper[4792]: I0301 10:07:26.997366 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pph6g" podStartSLOduration=2.550711879 podStartE2EDuration="6.997346475s" podCreationTimestamp="2026-03-01 10:07:20 +0000 UTC" firstStartedPulling="2026-03-01 10:07:21.907263881 +0000 UTC m=+3571.149143068" lastFinishedPulling="2026-03-01 10:07:26.353898467 +0000 UTC m=+3575.595777664" observedRunningTime="2026-03-01 10:07:26.995744625 +0000 UTC m=+3576.237623822" watchObservedRunningTime="2026-03-01 10:07:26.997346475 +0000 UTC m=+3576.239225672" Mar 01 10:07:27 crc kubenswrapper[4792]: I0301 10:07:27.983837 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltk8j" 
event={"ID":"3512046c-8b15-4ab0-8c94-39d0d0f2e73c","Type":"ContainerStarted","Data":"01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55"} Mar 01 10:07:30 crc kubenswrapper[4792]: I0301 10:07:30.000554 4792 generic.go:334] "Generic (PLEG): container finished" podID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerID="01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55" exitCode=0 Mar 01 10:07:30 crc kubenswrapper[4792]: I0301 10:07:30.000810 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltk8j" event={"ID":"3512046c-8b15-4ab0-8c94-39d0d0f2e73c","Type":"ContainerDied","Data":"01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55"} Mar 01 10:07:30 crc kubenswrapper[4792]: I0301 10:07:30.287857 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:30 crc kubenswrapper[4792]: I0301 10:07:30.353201 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:30 crc kubenswrapper[4792]: I0301 10:07:30.389697 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:30 crc kubenswrapper[4792]: I0301 10:07:30.389737 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:31 crc kubenswrapper[4792]: I0301 10:07:31.011268 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltk8j" event={"ID":"3512046c-8b15-4ab0-8c94-39d0d0f2e73c","Type":"ContainerStarted","Data":"caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178"} Mar 01 10:07:31 crc kubenswrapper[4792]: I0301 10:07:31.052834 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-ltk8j" podStartSLOduration=3.5594541939999997 podStartE2EDuration="7.052813392s" podCreationTimestamp="2026-03-01 10:07:24 +0000 UTC" firstStartedPulling="2026-03-01 10:07:26.957699886 +0000 UTC m=+3576.199579083" lastFinishedPulling="2026-03-01 10:07:30.451059084 +0000 UTC m=+3579.692938281" observedRunningTime="2026-03-01 10:07:31.052043073 +0000 UTC m=+3580.293922270" watchObservedRunningTime="2026-03-01 10:07:31.052813392 +0000 UTC m=+3580.294692589" Mar 01 10:07:31 crc kubenswrapper[4792]: I0301 10:07:31.451505 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pph6g" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="registry-server" probeResult="failure" output=< Mar 01 10:07:31 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:07:31 crc kubenswrapper[4792]: > Mar 01 10:07:32 crc kubenswrapper[4792]: I0301 10:07:32.593372 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-48qp5"] Mar 01 10:07:32 crc kubenswrapper[4792]: I0301 10:07:32.594260 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-48qp5" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="registry-server" containerID="cri-o://539ab3bdb0f8e33c45ad9a4e194bc21761a2d32030aa18bd3fd289463fab5c69" gracePeriod=2 Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.063151 4792 generic.go:334] "Generic (PLEG): container finished" podID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerID="539ab3bdb0f8e33c45ad9a4e194bc21761a2d32030aa18bd3fd289463fab5c69" exitCode=0 Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.063445 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48qp5" 
event={"ID":"682f4383-d3fb-4efe-89f5-e496b4b3b71b","Type":"ContainerDied","Data":"539ab3bdb0f8e33c45ad9a4e194bc21761a2d32030aa18bd3fd289463fab5c69"} Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.175610 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.319878 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvpl7\" (UniqueName: \"kubernetes.io/projected/682f4383-d3fb-4efe-89f5-e496b4b3b71b-kube-api-access-xvpl7\") pod \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.319988 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-catalog-content\") pod \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.320054 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-utilities\") pod \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.320837 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-utilities" (OuterVolumeSpecName: "utilities") pod "682f4383-d3fb-4efe-89f5-e496b4b3b71b" (UID: "682f4383-d3fb-4efe-89f5-e496b4b3b71b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.328102 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/682f4383-d3fb-4efe-89f5-e496b4b3b71b-kube-api-access-xvpl7" (OuterVolumeSpecName: "kube-api-access-xvpl7") pod "682f4383-d3fb-4efe-89f5-e496b4b3b71b" (UID: "682f4383-d3fb-4efe-89f5-e496b4b3b71b"). InnerVolumeSpecName "kube-api-access-xvpl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.368059 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "682f4383-d3fb-4efe-89f5-e496b4b3b71b" (UID: "682f4383-d3fb-4efe-89f5-e496b4b3b71b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.422861 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvpl7\" (UniqueName: \"kubernetes.io/projected/682f4383-d3fb-4efe-89f5-e496b4b3b71b-kube-api-access-xvpl7\") on node \"crc\" DevicePath \"\"" Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.423156 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.423230 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:07:34 crc kubenswrapper[4792]: I0301 10:07:34.074300 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48qp5" 
event={"ID":"682f4383-d3fb-4efe-89f5-e496b4b3b71b","Type":"ContainerDied","Data":"94eba7eaa028ed08ef077cfc6746bc96f898cc9f246702b2a5a689eae3288e03"}
Mar 01 10:07:34 crc kubenswrapper[4792]: I0301 10:07:34.074355 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-48qp5"
Mar 01 10:07:34 crc kubenswrapper[4792]: I0301 10:07:34.074638 4792 scope.go:117] "RemoveContainer" containerID="539ab3bdb0f8e33c45ad9a4e194bc21761a2d32030aa18bd3fd289463fab5c69"
Mar 01 10:07:34 crc kubenswrapper[4792]: I0301 10:07:34.095491 4792 scope.go:117] "RemoveContainer" containerID="be7a136c39cab1fb688e8478bf8ada2e256e2ff41b168a20058a43540f860198"
Mar 01 10:07:34 crc kubenswrapper[4792]: I0301 10:07:34.110152 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-48qp5"]
Mar 01 10:07:34 crc kubenswrapper[4792]: I0301 10:07:34.119706 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-48qp5"]
Mar 01 10:07:34 crc kubenswrapper[4792]: I0301 10:07:34.158275 4792 scope.go:117] "RemoveContainer" containerID="550d4cbb086003d878b362fc46f3972d5e8a3737e8bfdb31dd607b8a5db535b2"
Mar 01 10:07:35 crc kubenswrapper[4792]: I0301 10:07:35.364637 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ltk8j"
Mar 01 10:07:35 crc kubenswrapper[4792]: I0301 10:07:35.366315 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ltk8j"
Mar 01 10:07:35 crc kubenswrapper[4792]: I0301 10:07:35.420486 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" path="/var/lib/kubelet/pods/682f4383-d3fb-4efe-89f5-e496b4b3b71b/volumes"
Mar 01 10:07:36 crc kubenswrapper[4792]: I0301 10:07:36.411400 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-ltk8j" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="registry-server" probeResult="failure" output=<
Mar 01 10:07:36 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 01 10:07:36 crc kubenswrapper[4792]: >
Mar 01 10:07:41 crc kubenswrapper[4792]: I0301 10:07:41.437247 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pph6g" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="registry-server" probeResult="failure" output=<
Mar 01 10:07:41 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 01 10:07:41 crc kubenswrapper[4792]: >
Mar 01 10:07:45 crc kubenswrapper[4792]: I0301 10:07:45.440836 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ltk8j"
Mar 01 10:07:45 crc kubenswrapper[4792]: I0301 10:07:45.500977 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ltk8j"
Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.173502 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ltk8j"]
Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.221205 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ltk8j" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="registry-server" containerID="cri-o://caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178" gracePeriod=2
Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.862680 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ltk8j"
Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.901392 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-utilities\") pod \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") "
Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.901453 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-974g4\" (UniqueName: \"kubernetes.io/projected/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-kube-api-access-974g4\") pod \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") "
Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.901620 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-catalog-content\") pod \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") "
Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.902305 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-utilities" (OuterVolumeSpecName: "utilities") pod "3512046c-8b15-4ab0-8c94-39d0d0f2e73c" (UID: "3512046c-8b15-4ab0-8c94-39d0d0f2e73c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.910856 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-kube-api-access-974g4" (OuterVolumeSpecName: "kube-api-access-974g4") pod "3512046c-8b15-4ab0-8c94-39d0d0f2e73c" (UID: "3512046c-8b15-4ab0-8c94-39d0d0f2e73c"). InnerVolumeSpecName "kube-api-access-974g4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.929361 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3512046c-8b15-4ab0-8c94-39d0d0f2e73c" (UID: "3512046c-8b15-4ab0-8c94-39d0d0f2e73c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.003735 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.003778 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-utilities\") on node \"crc\" DevicePath \"\""
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.003791 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-974g4\" (UniqueName: \"kubernetes.io/projected/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-kube-api-access-974g4\") on node \"crc\" DevicePath \"\""
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.230645 4792 generic.go:334] "Generic (PLEG): container finished" podID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerID="caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178" exitCode=0
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.230712 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ltk8j"
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.230713 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltk8j" event={"ID":"3512046c-8b15-4ab0-8c94-39d0d0f2e73c","Type":"ContainerDied","Data":"caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178"}
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.231697 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltk8j" event={"ID":"3512046c-8b15-4ab0-8c94-39d0d0f2e73c","Type":"ContainerDied","Data":"7a85f6553115dd71b80a2c8f43818f4072ecec30d5eb3e65b978374e0bc127b2"}
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.231729 4792 scope.go:117] "RemoveContainer" containerID="caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178"
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.273556 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ltk8j"]
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.277110 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ltk8j"]
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.287620 4792 scope.go:117] "RemoveContainer" containerID="01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55"
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.311480 4792 scope.go:117] "RemoveContainer" containerID="b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3"
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.365133 4792 scope.go:117] "RemoveContainer" containerID="caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178"
Mar 01 10:07:48 crc kubenswrapper[4792]: E0301 10:07:48.365800 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178\": container with ID starting with caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178 not found: ID does not exist" containerID="caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178"
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.366026 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178"} err="failed to get container status \"caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178\": rpc error: code = NotFound desc = could not find container \"caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178\": container with ID starting with caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178 not found: ID does not exist"
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.366058 4792 scope.go:117] "RemoveContainer" containerID="01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55"
Mar 01 10:07:48 crc kubenswrapper[4792]: E0301 10:07:48.366384 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55\": container with ID starting with 01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55 not found: ID does not exist" containerID="01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55"
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.366407 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55"} err="failed to get container status \"01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55\": rpc error: code = NotFound desc = could not find container \"01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55\": container with ID starting with 01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55 not found: ID does not exist"
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.366421 4792 scope.go:117] "RemoveContainer" containerID="b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3"
Mar 01 10:07:48 crc kubenswrapper[4792]: E0301 10:07:48.366931 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3\": container with ID starting with b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3 not found: ID does not exist" containerID="b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3"
Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.366972 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3"} err="failed to get container status \"b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3\": rpc error: code = NotFound desc = could not find container \"b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3\": container with ID starting with b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3 not found: ID does not exist"
Mar 01 10:07:49 crc kubenswrapper[4792]: I0301 10:07:49.418671 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" path="/var/lib/kubelet/pods/3512046c-8b15-4ab0-8c94-39d0d0f2e73c/volumes"
Mar 01 10:07:50 crc kubenswrapper[4792]: I0301 10:07:50.427315 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pph6g"
Mar 01 10:07:50 crc kubenswrapper[4792]: I0301 10:07:50.504245 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pph6g"
Mar 01 10:07:51 crc kubenswrapper[4792]: I0301 10:07:51.375693 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pph6g"]
Mar 01 10:07:52 crc kubenswrapper[4792]: I0301 10:07:52.268655 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pph6g" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="registry-server" containerID="cri-o://f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49" gracePeriod=2
Mar 01 10:07:52 crc kubenswrapper[4792]: I0301 10:07:52.881923 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pph6g"
Mar 01 10:07:52 crc kubenswrapper[4792]: I0301 10:07:52.902396 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch59q\" (UniqueName: \"kubernetes.io/projected/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-kube-api-access-ch59q\") pod \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") "
Mar 01 10:07:52 crc kubenswrapper[4792]: I0301 10:07:52.902592 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-catalog-content\") pod \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") "
Mar 01 10:07:52 crc kubenswrapper[4792]: I0301 10:07:52.902641 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-utilities\") pod \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") "
Mar 01 10:07:52 crc kubenswrapper[4792]: I0301 10:07:52.904302 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-utilities" (OuterVolumeSpecName: "utilities") pod "8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" (UID: "8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 10:07:52 crc kubenswrapper[4792]: I0301 10:07:52.916207 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-kube-api-access-ch59q" (OuterVolumeSpecName: "kube-api-access-ch59q") pod "8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" (UID: "8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef"). InnerVolumeSpecName "kube-api-access-ch59q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.020754 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch59q\" (UniqueName: \"kubernetes.io/projected/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-kube-api-access-ch59q\") on node \"crc\" DevicePath \"\""
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.020784 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-utilities\") on node \"crc\" DevicePath \"\""
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.059119 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" (UID: "8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.125687 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.278973 4792 generic.go:334] "Generic (PLEG): container finished" podID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerID="f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49" exitCode=0
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.279017 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pph6g" event={"ID":"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef","Type":"ContainerDied","Data":"f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49"}
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.279045 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pph6g" event={"ID":"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef","Type":"ContainerDied","Data":"20eeae96dbcf40cf915287f68860afaea37e4de6c7d4d59e23e4ddf89a89c82c"}
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.279054 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pph6g"
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.279061 4792 scope.go:117] "RemoveContainer" containerID="f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49"
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.296509 4792 scope.go:117] "RemoveContainer" containerID="e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04"
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.319208 4792 scope.go:117] "RemoveContainer" containerID="bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74"
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.323545 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pph6g"]
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.332103 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pph6g"]
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.369228 4792 scope.go:117] "RemoveContainer" containerID="f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49"
Mar 01 10:07:53 crc kubenswrapper[4792]: E0301 10:07:53.369555 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49\": container with ID starting with f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49 not found: ID does not exist" containerID="f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49"
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.369586 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49"} err="failed to get container status \"f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49\": rpc error: code = NotFound desc = could not find container \"f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49\": container with ID starting with f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49 not found: ID does not exist"
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.369607 4792 scope.go:117] "RemoveContainer" containerID="e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04"
Mar 01 10:07:53 crc kubenswrapper[4792]: E0301 10:07:53.369933 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04\": container with ID starting with e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04 not found: ID does not exist" containerID="e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04"
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.369953 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04"} err="failed to get container status \"e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04\": rpc error: code = NotFound desc = could not find container \"e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04\": container with ID starting with e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04 not found: ID does not exist"
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.369970 4792 scope.go:117] "RemoveContainer" containerID="bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74"
Mar 01 10:07:53 crc kubenswrapper[4792]: E0301 10:07:53.370263 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74\": container with ID starting with bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74 not found: ID does not exist" containerID="bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74"
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.370285 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74"} err="failed to get container status \"bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74\": rpc error: code = NotFound desc = could not find container \"bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74\": container with ID starting with bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74 not found: ID does not exist"
Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.418626 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" path="/var/lib/kubelet/pods/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef/volumes"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.146925 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539328-6t6hj"]
Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.147895 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="extract-utilities"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.147930 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="extract-utilities"
Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.147941 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="extract-content"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.147949 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="extract-content"
Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.147959 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="extract-utilities"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.147968 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="extract-utilities"
Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.147993 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="extract-utilities"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148002 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="extract-utilities"
Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.148015 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="registry-server"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148023 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="registry-server"
Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.148038 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="registry-server"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148045 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="registry-server"
Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.148067 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="extract-content"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148075 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="extract-content"
Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.148091 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="extract-content"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148099 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="extract-content"
Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.148118 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="registry-server"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148125 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="registry-server"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148336 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="registry-server"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148356 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="registry-server"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148383 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="registry-server"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.149141 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539328-6t6hj"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.151943 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.152197 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.152412 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.159351 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539328-6t6hj"]
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.259462 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmmjq\" (UniqueName: \"kubernetes.io/projected/42e71344-31b1-4817-b2e4-dd9aebb9d38e-kube-api-access-qmmjq\") pod \"auto-csr-approver-29539328-6t6hj\" (UID: \"42e71344-31b1-4817-b2e4-dd9aebb9d38e\") " pod="openshift-infra/auto-csr-approver-29539328-6t6hj"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.361154 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmmjq\" (UniqueName: \"kubernetes.io/projected/42e71344-31b1-4817-b2e4-dd9aebb9d38e-kube-api-access-qmmjq\") pod \"auto-csr-approver-29539328-6t6hj\" (UID: \"42e71344-31b1-4817-b2e4-dd9aebb9d38e\") " pod="openshift-infra/auto-csr-approver-29539328-6t6hj"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.394955 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmmjq\" (UniqueName: \"kubernetes.io/projected/42e71344-31b1-4817-b2e4-dd9aebb9d38e-kube-api-access-qmmjq\") pod \"auto-csr-approver-29539328-6t6hj\" (UID: \"42e71344-31b1-4817-b2e4-dd9aebb9d38e\") " pod="openshift-infra/auto-csr-approver-29539328-6t6hj"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.468328 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539328-6t6hj"
Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.952371 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539328-6t6hj"]
Mar 01 10:08:01 crc kubenswrapper[4792]: I0301 10:08:01.341476 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539328-6t6hj" event={"ID":"42e71344-31b1-4817-b2e4-dd9aebb9d38e","Type":"ContainerStarted","Data":"df7225161bec7252d57bf5e2340343fa6752a0760a457d189ea4e699057f74c6"}
Mar 01 10:08:02 crc kubenswrapper[4792]: I0301 10:08:02.354485 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539328-6t6hj" event={"ID":"42e71344-31b1-4817-b2e4-dd9aebb9d38e","Type":"ContainerStarted","Data":"c3e0ad9fbd90b282daf14548418069de81001946b8230adb9c02a7933419e3fe"}
Mar 01 10:08:02 crc kubenswrapper[4792]: I0301 10:08:02.377572 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539328-6t6hj" podStartSLOduration=1.572232772 podStartE2EDuration="2.377546733s" podCreationTimestamp="2026-03-01 10:08:00 +0000 UTC" firstStartedPulling="2026-03-01 10:08:00.971705255 +0000 UTC m=+3610.213584452" lastFinishedPulling="2026-03-01 10:08:01.777019216 +0000 UTC m=+3611.018898413" observedRunningTime="2026-03-01 10:08:02.370554009 +0000 UTC m=+3611.612433226" watchObservedRunningTime="2026-03-01 10:08:02.377546733 +0000 UTC m=+3611.619425940"
Mar 01 10:08:03 crc kubenswrapper[4792]: I0301 10:08:03.364173 4792 generic.go:334] "Generic (PLEG): container finished" podID="42e71344-31b1-4817-b2e4-dd9aebb9d38e" containerID="c3e0ad9fbd90b282daf14548418069de81001946b8230adb9c02a7933419e3fe" exitCode=0
Mar 01 10:08:03 crc kubenswrapper[4792]: I0301 10:08:03.364217 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539328-6t6hj" event={"ID":"42e71344-31b1-4817-b2e4-dd9aebb9d38e","Type":"ContainerDied","Data":"c3e0ad9fbd90b282daf14548418069de81001946b8230adb9c02a7933419e3fe"}
Mar 01 10:08:04 crc kubenswrapper[4792]: I0301 10:08:04.759981 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539328-6t6hj"
Mar 01 10:08:04 crc kubenswrapper[4792]: I0301 10:08:04.851787 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmmjq\" (UniqueName: \"kubernetes.io/projected/42e71344-31b1-4817-b2e4-dd9aebb9d38e-kube-api-access-qmmjq\") pod \"42e71344-31b1-4817-b2e4-dd9aebb9d38e\" (UID: \"42e71344-31b1-4817-b2e4-dd9aebb9d38e\") "
Mar 01 10:08:04 crc kubenswrapper[4792]: I0301 10:08:04.857371 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e71344-31b1-4817-b2e4-dd9aebb9d38e-kube-api-access-qmmjq" (OuterVolumeSpecName: "kube-api-access-qmmjq") pod "42e71344-31b1-4817-b2e4-dd9aebb9d38e" (UID: "42e71344-31b1-4817-b2e4-dd9aebb9d38e"). InnerVolumeSpecName "kube-api-access-qmmjq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:08:04 crc kubenswrapper[4792]: I0301 10:08:04.954407 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmmjq\" (UniqueName: \"kubernetes.io/projected/42e71344-31b1-4817-b2e4-dd9aebb9d38e-kube-api-access-qmmjq\") on node \"crc\" DevicePath \"\""
Mar 01 10:08:05 crc kubenswrapper[4792]: I0301 10:08:05.386957 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539328-6t6hj" event={"ID":"42e71344-31b1-4817-b2e4-dd9aebb9d38e","Type":"ContainerDied","Data":"df7225161bec7252d57bf5e2340343fa6752a0760a457d189ea4e699057f74c6"}
Mar 01 10:08:05 crc kubenswrapper[4792]: I0301 10:08:05.387013 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df7225161bec7252d57bf5e2340343fa6752a0760a457d189ea4e699057f74c6"
Mar 01 10:08:05 crc kubenswrapper[4792]: I0301 10:08:05.387010 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539328-6t6hj"
Mar 01 10:08:05 crc kubenswrapper[4792]: I0301 10:08:05.842328 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539322-cwkrx"]
Mar 01 10:08:05 crc kubenswrapper[4792]: I0301 10:08:05.870645 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539322-cwkrx"]
Mar 01 10:08:07 crc kubenswrapper[4792]: I0301 10:08:07.419060 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b" path="/var/lib/kubelet/pods/d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b/volumes"
Mar 01 10:08:32 crc kubenswrapper[4792]: I0301 10:08:32.696983 4792 scope.go:117] "RemoveContainer" containerID="ec82ee72087ccca998ba33c4f0000c5036f1693b6f0bbb69a46ea60c4b7efd7e"
Mar 01 10:08:32 crc kubenswrapper[4792]: I0301 10:08:32.773062 4792 scope.go:117] "RemoveContainer" containerID="e9350b459945620c5d55700a03e5aa49740d1a4854b26be997bf075b28191bfe"
Mar 01 10:08:32 crc kubenswrapper[4792]: I0301 10:08:32.806219 4792 scope.go:117] "RemoveContainer" containerID="f8623324113a06ed0f840a7abbd929eaf57c541d1aae8f4b05f1bf6b431b24fe"
Mar 01 10:09:04 crc kubenswrapper[4792]: I0301 10:09:04.943478 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 10:09:04 crc kubenswrapper[4792]: I0301 10:09:04.944760 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 10:09:34 crc kubenswrapper[4792]: I0301 10:09:34.942668 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 10:09:34 crc kubenswrapper[4792]: I0301 10:09:34.943098 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.193316 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539330-wqct7"]
Mar 01 10:10:00 crc kubenswrapper[4792]: E0301 10:10:00.193997 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e71344-31b1-4817-b2e4-dd9aebb9d38e" containerName="oc"
Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.194010 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e71344-31b1-4817-b2e4-dd9aebb9d38e" containerName="oc"
Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.194221 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e71344-31b1-4817-b2e4-dd9aebb9d38e" containerName="oc"
Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.194884 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539330-wqct7"
Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.198314 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz"
Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.198503 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.198629 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.206772 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539330-wqct7"]
Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.295507 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9chm\" (UniqueName: \"kubernetes.io/projected/a9025376-622f-4b7e-94d5-c5b136e139d8-kube-api-access-r9chm\") pod \"auto-csr-approver-29539330-wqct7\" (UID: \"a9025376-622f-4b7e-94d5-c5b136e139d8\") " pod="openshift-infra/auto-csr-approver-29539330-wqct7"
Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.396979 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9chm\" (UniqueName: \"kubernetes.io/projected/a9025376-622f-4b7e-94d5-c5b136e139d8-kube-api-access-r9chm\") pod \"auto-csr-approver-29539330-wqct7\" (UID: \"a9025376-622f-4b7e-94d5-c5b136e139d8\") " pod="openshift-infra/auto-csr-approver-29539330-wqct7"
Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.415374 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9chm\" (UniqueName: \"kubernetes.io/projected/a9025376-622f-4b7e-94d5-c5b136e139d8-kube-api-access-r9chm\") pod \"auto-csr-approver-29539330-wqct7\" (UID: \"a9025376-622f-4b7e-94d5-c5b136e139d8\") " pod="openshift-infra/auto-csr-approver-29539330-wqct7"
Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.514550 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539330-wqct7"
Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.986002 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539330-wqct7"]
Mar 01 10:10:01 crc kubenswrapper[4792]: I0301 10:10:01.394197 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539330-wqct7" event={"ID":"a9025376-622f-4b7e-94d5-c5b136e139d8","Type":"ContainerStarted","Data":"b1a908e36847c130d22fa7138ad124b33ebf296414af538320c02b60d31d23bc"}
Mar 01 10:10:02 crc kubenswrapper[4792]: I0301 10:10:02.403010 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539330-wqct7" event={"ID":"a9025376-622f-4b7e-94d5-c5b136e139d8","Type":"ContainerStarted","Data":"efa17f624abb4097f736a6d5cc9c3131b915c07d36d54a5857fd159d31a5a613"}
Mar 01 10:10:02 crc kubenswrapper[4792]: I0301 10:10:02.420196 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539330-wqct7" podStartSLOduration=1.272680491 podStartE2EDuration="2.420174711s" podCreationTimestamp="2026-03-01 10:10:00 +0000 UTC"
firstStartedPulling="2026-03-01 10:10:00.988252172 +0000 UTC m=+3730.230131409" lastFinishedPulling="2026-03-01 10:10:02.135746432 +0000 UTC m=+3731.377625629" observedRunningTime="2026-03-01 10:10:02.420028538 +0000 UTC m=+3731.661907735" watchObservedRunningTime="2026-03-01 10:10:02.420174711 +0000 UTC m=+3731.662053898" Mar 01 10:10:03 crc kubenswrapper[4792]: I0301 10:10:03.418933 4792 generic.go:334] "Generic (PLEG): container finished" podID="a9025376-622f-4b7e-94d5-c5b136e139d8" containerID="efa17f624abb4097f736a6d5cc9c3131b915c07d36d54a5857fd159d31a5a613" exitCode=0 Mar 01 10:10:03 crc kubenswrapper[4792]: I0301 10:10:03.419004 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539330-wqct7" event={"ID":"a9025376-622f-4b7e-94d5-c5b136e139d8","Type":"ContainerDied","Data":"efa17f624abb4097f736a6d5cc9c3131b915c07d36d54a5857fd159d31a5a613"} Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.834456 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539330-wqct7" Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.891332 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9chm\" (UniqueName: \"kubernetes.io/projected/a9025376-622f-4b7e-94d5-c5b136e139d8-kube-api-access-r9chm\") pod \"a9025376-622f-4b7e-94d5-c5b136e139d8\" (UID: \"a9025376-622f-4b7e-94d5-c5b136e139d8\") " Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.897105 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9025376-622f-4b7e-94d5-c5b136e139d8-kube-api-access-r9chm" (OuterVolumeSpecName: "kube-api-access-r9chm") pod "a9025376-622f-4b7e-94d5-c5b136e139d8" (UID: "a9025376-622f-4b7e-94d5-c5b136e139d8"). InnerVolumeSpecName "kube-api-access-r9chm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.944365 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.944584 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.944718 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.948867 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"947fc3b8131dfb1dafd7693a1530cb8db816d30e8340c8877587b0f93967d46d"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.949107 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://947fc3b8131dfb1dafd7693a1530cb8db816d30e8340c8877587b0f93967d46d" gracePeriod=600 Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.996109 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9chm\" (UniqueName: 
\"kubernetes.io/projected/a9025376-622f-4b7e-94d5-c5b136e139d8-kube-api-access-r9chm\") on node \"crc\" DevicePath \"\"" Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.443579 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="947fc3b8131dfb1dafd7693a1530cb8db816d30e8340c8877587b0f93967d46d" exitCode=0 Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.443737 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"947fc3b8131dfb1dafd7693a1530cb8db816d30e8340c8877587b0f93967d46d"} Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.443857 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"} Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.443874 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.449482 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539330-wqct7" Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.449396 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539330-wqct7" event={"ID":"a9025376-622f-4b7e-94d5-c5b136e139d8","Type":"ContainerDied","Data":"b1a908e36847c130d22fa7138ad124b33ebf296414af538320c02b60d31d23bc"} Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.454938 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1a908e36847c130d22fa7138ad124b33ebf296414af538320c02b60d31d23bc" Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.900216 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539324-wjq94"] Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.908858 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539324-wjq94"] Mar 01 10:10:07 crc kubenswrapper[4792]: I0301 10:10:07.429484 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41b7071d-7243-4ca4-82e6-c153c3001d1f" path="/var/lib/kubelet/pods/41b7071d-7243-4ca4-82e6-c153c3001d1f/volumes" Mar 01 10:10:32 crc kubenswrapper[4792]: I0301 10:10:32.962793 4792 scope.go:117] "RemoveContainer" containerID="e004ff88b9b0e0f691af76b372e4044089dce1eaaf6619be9de9fed1b4d58c18" Mar 01 10:10:56 crc kubenswrapper[4792]: I0301 10:10:56.039443 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-svb22"] Mar 01 10:10:56 crc kubenswrapper[4792]: I0301 10:10:56.048699 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-svb22"] Mar 01 10:10:56 crc kubenswrapper[4792]: I0301 10:10:56.058571 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-fa75-account-create-update-s2kng"] Mar 01 10:10:56 crc kubenswrapper[4792]: I0301 10:10:56.068723 4792 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/manila-fa75-account-create-update-s2kng"] Mar 01 10:10:57 crc kubenswrapper[4792]: I0301 10:10:57.419392 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0441b486-847a-4f32-8df2-a1284f39ee5d" path="/var/lib/kubelet/pods/0441b486-847a-4f32-8df2-a1284f39ee5d/volumes" Mar 01 10:10:57 crc kubenswrapper[4792]: I0301 10:10:57.421681 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7736148-bc12-4621-a1d2-efc4a0143b42" path="/var/lib/kubelet/pods/c7736148-bc12-4621-a1d2-efc4a0143b42/volumes" Mar 01 10:11:31 crc kubenswrapper[4792]: I0301 10:11:31.047226 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-p2dtn"] Mar 01 10:11:31 crc kubenswrapper[4792]: I0301 10:11:31.059847 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-p2dtn"] Mar 01 10:11:31 crc kubenswrapper[4792]: I0301 10:11:31.420061 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01bf5dae-6217-4644-9c9b-65d3886a4dc1" path="/var/lib/kubelet/pods/01bf5dae-6217-4644-9c9b-65d3886a4dc1/volumes" Mar 01 10:11:33 crc kubenswrapper[4792]: I0301 10:11:33.036519 4792 scope.go:117] "RemoveContainer" containerID="bf57ceafd6066a28052f3666ed7d384740c5837c8329274397bd6fa48c44d661" Mar 01 10:11:33 crc kubenswrapper[4792]: I0301 10:11:33.076558 4792 scope.go:117] "RemoveContainer" containerID="a0b92e5e12c9f6c28a62c7a978997e469e5d99be007aba61824b2b6c8d62ffa5" Mar 01 10:11:33 crc kubenswrapper[4792]: I0301 10:11:33.123136 4792 scope.go:117] "RemoveContainer" containerID="31b1b88de471cccce9a774cf2494168f31cb583ae731b5c4c5efee9a815b5533" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.147402 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539332-f6brk"] Mar 01 10:12:00 crc kubenswrapper[4792]: E0301 10:12:00.148292 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a9025376-622f-4b7e-94d5-c5b136e139d8" containerName="oc" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.148307 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9025376-622f-4b7e-94d5-c5b136e139d8" containerName="oc" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.148503 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9025376-622f-4b7e-94d5-c5b136e139d8" containerName="oc" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.149369 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539332-f6brk" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.152042 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.152322 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.156574 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539332-f6brk"] Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.158021 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.283239 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj8q7\" (UniqueName: \"kubernetes.io/projected/5f9c6b5a-8834-45a7-bf9d-000bcfd068f3-kube-api-access-sj8q7\") pod \"auto-csr-approver-29539332-f6brk\" (UID: \"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3\") " pod="openshift-infra/auto-csr-approver-29539332-f6brk" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.384622 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj8q7\" (UniqueName: 
\"kubernetes.io/projected/5f9c6b5a-8834-45a7-bf9d-000bcfd068f3-kube-api-access-sj8q7\") pod \"auto-csr-approver-29539332-f6brk\" (UID: \"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3\") " pod="openshift-infra/auto-csr-approver-29539332-f6brk" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.409120 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj8q7\" (UniqueName: \"kubernetes.io/projected/5f9c6b5a-8834-45a7-bf9d-000bcfd068f3-kube-api-access-sj8q7\") pod \"auto-csr-approver-29539332-f6brk\" (UID: \"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3\") " pod="openshift-infra/auto-csr-approver-29539332-f6brk" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.477793 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539332-f6brk" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.918786 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539332-f6brk"] Mar 01 10:12:01 crc kubenswrapper[4792]: I0301 10:12:01.450273 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539332-f6brk" event={"ID":"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3","Type":"ContainerStarted","Data":"aaf8adc92bf2959fc5697ba19835736125cffd65ce1dccd42bd001b1bacd2428"} Mar 01 10:12:02 crc kubenswrapper[4792]: I0301 10:12:02.464800 4792 generic.go:334] "Generic (PLEG): container finished" podID="5f9c6b5a-8834-45a7-bf9d-000bcfd068f3" containerID="92df935b0380167f616deb42823e3a7a1744b9e85454d7327ad662b50fab1963" exitCode=0 Mar 01 10:12:02 crc kubenswrapper[4792]: I0301 10:12:02.465596 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539332-f6brk" event={"ID":"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3","Type":"ContainerDied","Data":"92df935b0380167f616deb42823e3a7a1744b9e85454d7327ad662b50fab1963"} Mar 01 10:12:04 crc kubenswrapper[4792]: I0301 10:12:04.482504 4792 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539332-f6brk" event={"ID":"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3","Type":"ContainerDied","Data":"aaf8adc92bf2959fc5697ba19835736125cffd65ce1dccd42bd001b1bacd2428"} Mar 01 10:12:04 crc kubenswrapper[4792]: I0301 10:12:04.482774 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaf8adc92bf2959fc5697ba19835736125cffd65ce1dccd42bd001b1bacd2428" Mar 01 10:12:04 crc kubenswrapper[4792]: I0301 10:12:04.506023 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539332-f6brk" Mar 01 10:12:04 crc kubenswrapper[4792]: I0301 10:12:04.572970 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj8q7\" (UniqueName: \"kubernetes.io/projected/5f9c6b5a-8834-45a7-bf9d-000bcfd068f3-kube-api-access-sj8q7\") pod \"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3\" (UID: \"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3\") " Mar 01 10:12:04 crc kubenswrapper[4792]: I0301 10:12:04.587771 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9c6b5a-8834-45a7-bf9d-000bcfd068f3-kube-api-access-sj8q7" (OuterVolumeSpecName: "kube-api-access-sj8q7") pod "5f9c6b5a-8834-45a7-bf9d-000bcfd068f3" (UID: "5f9c6b5a-8834-45a7-bf9d-000bcfd068f3"). InnerVolumeSpecName "kube-api-access-sj8q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:12:04 crc kubenswrapper[4792]: I0301 10:12:04.675834 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj8q7\" (UniqueName: \"kubernetes.io/projected/5f9c6b5a-8834-45a7-bf9d-000bcfd068f3-kube-api-access-sj8q7\") on node \"crc\" DevicePath \"\"" Mar 01 10:12:05 crc kubenswrapper[4792]: I0301 10:12:05.490420 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539332-f6brk" Mar 01 10:12:05 crc kubenswrapper[4792]: I0301 10:12:05.581631 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539326-7hz59"] Mar 01 10:12:05 crc kubenswrapper[4792]: I0301 10:12:05.589893 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539326-7hz59"] Mar 01 10:12:07 crc kubenswrapper[4792]: I0301 10:12:07.419544 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8" path="/var/lib/kubelet/pods/e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8/volumes" Mar 01 10:12:33 crc kubenswrapper[4792]: I0301 10:12:33.218467 4792 scope.go:117] "RemoveContainer" containerID="985260a2ba2153789f87b2fc888d57bf9b851f86fd15aaf0dca4797eeb86773f" Mar 01 10:12:34 crc kubenswrapper[4792]: I0301 10:12:34.943366 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:12:34 crc kubenswrapper[4792]: I0301 10:12:34.944046 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:13:04 crc kubenswrapper[4792]: I0301 10:13:04.942463 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:13:04 crc kubenswrapper[4792]: 
I0301 10:13:04.942948 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:13:34 crc kubenswrapper[4792]: I0301 10:13:34.943437 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:13:34 crc kubenswrapper[4792]: I0301 10:13:34.944032 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:13:34 crc kubenswrapper[4792]: I0301 10:13:34.944080 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 10:13:34 crc kubenswrapper[4792]: I0301 10:13:34.944930 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 10:13:34 crc kubenswrapper[4792]: I0301 10:13:34.944993 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" 
containerName="machine-config-daemon" containerID="cri-o://5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" gracePeriod=600 Mar 01 10:13:35 crc kubenswrapper[4792]: E0301 10:13:35.091229 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:13:35 crc kubenswrapper[4792]: I0301 10:13:35.838823 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" exitCode=0 Mar 01 10:13:35 crc kubenswrapper[4792]: I0301 10:13:35.838871 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"} Mar 01 10:13:35 crc kubenswrapper[4792]: I0301 10:13:35.838939 4792 scope.go:117] "RemoveContainer" containerID="947fc3b8131dfb1dafd7693a1530cb8db816d30e8340c8877587b0f93967d46d" Mar 01 10:13:35 crc kubenswrapper[4792]: I0301 10:13:35.840051 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:13:35 crc kubenswrapper[4792]: E0301 10:13:35.840402 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:13:50 crc kubenswrapper[4792]: I0301 10:13:50.409676 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:13:50 crc kubenswrapper[4792]: E0301 10:13:50.410502 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.152568 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539334-c4cd2"] Mar 01 10:14:00 crc kubenswrapper[4792]: E0301 10:14:00.154930 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9c6b5a-8834-45a7-bf9d-000bcfd068f3" containerName="oc" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.155037 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9c6b5a-8834-45a7-bf9d-000bcfd068f3" containerName="oc" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.155391 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9c6b5a-8834-45a7-bf9d-000bcfd068f3" containerName="oc" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.158221 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.162225 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.162448 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.162598 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.178680 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539334-c4cd2"] Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.335073 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdzn4\" (UniqueName: \"kubernetes.io/projected/807c2da3-ef0e-4e89-9457-37401354a8e9-kube-api-access-vdzn4\") pod \"auto-csr-approver-29539334-c4cd2\" (UID: \"807c2da3-ef0e-4e89-9457-37401354a8e9\") " pod="openshift-infra/auto-csr-approver-29539334-c4cd2" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.437694 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdzn4\" (UniqueName: \"kubernetes.io/projected/807c2da3-ef0e-4e89-9457-37401354a8e9-kube-api-access-vdzn4\") pod \"auto-csr-approver-29539334-c4cd2\" (UID: \"807c2da3-ef0e-4e89-9457-37401354a8e9\") " pod="openshift-infra/auto-csr-approver-29539334-c4cd2" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.456462 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdzn4\" (UniqueName: \"kubernetes.io/projected/807c2da3-ef0e-4e89-9457-37401354a8e9-kube-api-access-vdzn4\") pod \"auto-csr-approver-29539334-c4cd2\" (UID: \"807c2da3-ef0e-4e89-9457-37401354a8e9\") " 
pod="openshift-infra/auto-csr-approver-29539334-c4cd2" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.484391 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.950296 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539334-c4cd2"] Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.953708 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 10:14:01 crc kubenswrapper[4792]: I0301 10:14:01.064071 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" event={"ID":"807c2da3-ef0e-4e89-9457-37401354a8e9","Type":"ContainerStarted","Data":"e137a30b501156f803311eaa6d17b90f3d610114a7971add24db537517a94c70"} Mar 01 10:14:02 crc kubenswrapper[4792]: I0301 10:14:02.072025 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" event={"ID":"807c2da3-ef0e-4e89-9457-37401354a8e9","Type":"ContainerStarted","Data":"b30f740be1f0c17125e3d3e62c51a8c03429e4e9dd6640d5fa13fe74408f6823"} Mar 01 10:14:02 crc kubenswrapper[4792]: I0301 10:14:02.088199 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" podStartSLOduration=1.267282525 podStartE2EDuration="2.088182109s" podCreationTimestamp="2026-03-01 10:14:00 +0000 UTC" firstStartedPulling="2026-03-01 10:14:00.953516524 +0000 UTC m=+3970.195395721" lastFinishedPulling="2026-03-01 10:14:01.774416108 +0000 UTC m=+3971.016295305" observedRunningTime="2026-03-01 10:14:02.083232167 +0000 UTC m=+3971.325111364" watchObservedRunningTime="2026-03-01 10:14:02.088182109 +0000 UTC m=+3971.330061306" Mar 01 10:14:03 crc kubenswrapper[4792]: I0301 10:14:03.081630 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="807c2da3-ef0e-4e89-9457-37401354a8e9" containerID="b30f740be1f0c17125e3d3e62c51a8c03429e4e9dd6640d5fa13fe74408f6823" exitCode=0 Mar 01 10:14:03 crc kubenswrapper[4792]: I0301 10:14:03.081662 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" event={"ID":"807c2da3-ef0e-4e89-9457-37401354a8e9","Type":"ContainerDied","Data":"b30f740be1f0c17125e3d3e62c51a8c03429e4e9dd6640d5fa13fe74408f6823"} Mar 01 10:14:03 crc kubenswrapper[4792]: I0301 10:14:03.408374 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:14:03 crc kubenswrapper[4792]: E0301 10:14:03.408735 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:14:04 crc kubenswrapper[4792]: I0301 10:14:04.470125 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" Mar 01 10:14:04 crc kubenswrapper[4792]: I0301 10:14:04.513758 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdzn4\" (UniqueName: \"kubernetes.io/projected/807c2da3-ef0e-4e89-9457-37401354a8e9-kube-api-access-vdzn4\") pod \"807c2da3-ef0e-4e89-9457-37401354a8e9\" (UID: \"807c2da3-ef0e-4e89-9457-37401354a8e9\") " Mar 01 10:14:04 crc kubenswrapper[4792]: I0301 10:14:04.522548 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/807c2da3-ef0e-4e89-9457-37401354a8e9-kube-api-access-vdzn4" (OuterVolumeSpecName: "kube-api-access-vdzn4") pod "807c2da3-ef0e-4e89-9457-37401354a8e9" (UID: "807c2da3-ef0e-4e89-9457-37401354a8e9"). InnerVolumeSpecName "kube-api-access-vdzn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:14:04 crc kubenswrapper[4792]: I0301 10:14:04.615787 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdzn4\" (UniqueName: \"kubernetes.io/projected/807c2da3-ef0e-4e89-9457-37401354a8e9-kube-api-access-vdzn4\") on node \"crc\" DevicePath \"\"" Mar 01 10:14:05 crc kubenswrapper[4792]: I0301 10:14:05.098976 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" event={"ID":"807c2da3-ef0e-4e89-9457-37401354a8e9","Type":"ContainerDied","Data":"e137a30b501156f803311eaa6d17b90f3d610114a7971add24db537517a94c70"} Mar 01 10:14:05 crc kubenswrapper[4792]: I0301 10:14:05.099009 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" Mar 01 10:14:05 crc kubenswrapper[4792]: I0301 10:14:05.099014 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e137a30b501156f803311eaa6d17b90f3d610114a7971add24db537517a94c70" Mar 01 10:14:05 crc kubenswrapper[4792]: I0301 10:14:05.533378 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539328-6t6hj"] Mar 01 10:14:05 crc kubenswrapper[4792]: I0301 10:14:05.541736 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539328-6t6hj"] Mar 01 10:14:07 crc kubenswrapper[4792]: I0301 10:14:07.420278 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e71344-31b1-4817-b2e4-dd9aebb9d38e" path="/var/lib/kubelet/pods/42e71344-31b1-4817-b2e4-dd9aebb9d38e/volumes" Mar 01 10:14:18 crc kubenswrapper[4792]: I0301 10:14:18.409419 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:14:18 crc kubenswrapper[4792]: E0301 10:14:18.410062 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:14:31 crc kubenswrapper[4792]: I0301 10:14:31.415781 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:14:31 crc kubenswrapper[4792]: E0301 10:14:31.416447 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:14:33 crc kubenswrapper[4792]: I0301 10:14:33.745008 4792 scope.go:117] "RemoveContainer" containerID="c3e0ad9fbd90b282daf14548418069de81001946b8230adb9c02a7933419e3fe" Mar 01 10:14:43 crc kubenswrapper[4792]: I0301 10:14:43.409155 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:14:43 crc kubenswrapper[4792]: E0301 10:14:43.410053 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:14:55 crc kubenswrapper[4792]: I0301 10:14:55.408870 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:14:55 crc kubenswrapper[4792]: E0301 10:14:55.409764 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.208282 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"] Mar 01 10:15:00 crc 
kubenswrapper[4792]: E0301 10:15:00.209303 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807c2da3-ef0e-4e89-9457-37401354a8e9" containerName="oc" Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.209321 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="807c2da3-ef0e-4e89-9457-37401354a8e9" containerName="oc" Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.209534 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="807c2da3-ef0e-4e89-9457-37401354a8e9" containerName="oc" Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.210373 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.224282 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.225084 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.245874 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"] Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.311080 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d42ade11-691f-471f-ade0-0d9e12c70d1f-config-volume\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.311153 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d42ade11-691f-471f-ade0-0d9e12c70d1f-secret-volume\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.311171 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmcbv\" (UniqueName: \"kubernetes.io/projected/d42ade11-691f-471f-ade0-0d9e12c70d1f-kube-api-access-qmcbv\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.431482 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d42ade11-691f-471f-ade0-0d9e12c70d1f-config-volume\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.431615 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d42ade11-691f-471f-ade0-0d9e12c70d1f-secret-volume\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.431647 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmcbv\" (UniqueName: \"kubernetes.io/projected/d42ade11-691f-471f-ade0-0d9e12c70d1f-kube-api-access-qmcbv\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" Mar 01 10:15:00 crc 
kubenswrapper[4792]: I0301 10:15:00.433166 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d42ade11-691f-471f-ade0-0d9e12c70d1f-config-volume\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.475751 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d42ade11-691f-471f-ade0-0d9e12c70d1f-secret-volume\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.503948 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmcbv\" (UniqueName: \"kubernetes.io/projected/d42ade11-691f-471f-ade0-0d9e12c70d1f-kube-api-access-qmcbv\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.550156 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" Mar 01 10:15:01 crc kubenswrapper[4792]: I0301 10:15:01.096662 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"] Mar 01 10:15:01 crc kubenswrapper[4792]: I0301 10:15:01.623580 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" event={"ID":"d42ade11-691f-471f-ade0-0d9e12c70d1f","Type":"ContainerStarted","Data":"889e7f5e3098bfec5bc19d7c90225344ee63adb7281cb525b526611dc8b03fc1"} Mar 01 10:15:01 crc kubenswrapper[4792]: I0301 10:15:01.625035 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" event={"ID":"d42ade11-691f-471f-ade0-0d9e12c70d1f","Type":"ContainerStarted","Data":"57b5c9ebab15dec960db41e797d308da20fb69805b2d99394a6db01cb3b22ceb"} Mar 01 10:15:01 crc kubenswrapper[4792]: I0301 10:15:01.642117 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" podStartSLOduration=1.6420983310000001 podStartE2EDuration="1.642098331s" podCreationTimestamp="2026-03-01 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:15:01.636331638 +0000 UTC m=+4030.878210825" watchObservedRunningTime="2026-03-01 10:15:01.642098331 +0000 UTC m=+4030.883977528" Mar 01 10:15:02 crc kubenswrapper[4792]: I0301 10:15:02.632748 4792 generic.go:334] "Generic (PLEG): container finished" podID="d42ade11-691f-471f-ade0-0d9e12c70d1f" containerID="889e7f5e3098bfec5bc19d7c90225344ee63adb7281cb525b526611dc8b03fc1" exitCode=0 Mar 01 10:15:02 crc kubenswrapper[4792]: I0301 10:15:02.632819 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" event={"ID":"d42ade11-691f-471f-ade0-0d9e12c70d1f","Type":"ContainerDied","Data":"889e7f5e3098bfec5bc19d7c90225344ee63adb7281cb525b526611dc8b03fc1"} Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.133827 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.213991 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmcbv\" (UniqueName: \"kubernetes.io/projected/d42ade11-691f-471f-ade0-0d9e12c70d1f-kube-api-access-qmcbv\") pod \"d42ade11-691f-471f-ade0-0d9e12c70d1f\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.214145 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d42ade11-691f-471f-ade0-0d9e12c70d1f-secret-volume\") pod \"d42ade11-691f-471f-ade0-0d9e12c70d1f\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.214436 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d42ade11-691f-471f-ade0-0d9e12c70d1f-config-volume\") pod \"d42ade11-691f-471f-ade0-0d9e12c70d1f\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.215158 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d42ade11-691f-471f-ade0-0d9e12c70d1f-config-volume" (OuterVolumeSpecName: "config-volume") pod "d42ade11-691f-471f-ade0-0d9e12c70d1f" (UID: "d42ade11-691f-471f-ade0-0d9e12c70d1f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.215826 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d42ade11-691f-471f-ade0-0d9e12c70d1f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.224119 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42ade11-691f-471f-ade0-0d9e12c70d1f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d42ade11-691f-471f-ade0-0d9e12c70d1f" (UID: "d42ade11-691f-471f-ade0-0d9e12c70d1f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.224628 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d42ade11-691f-471f-ade0-0d9e12c70d1f-kube-api-access-qmcbv" (OuterVolumeSpecName: "kube-api-access-qmcbv") pod "d42ade11-691f-471f-ade0-0d9e12c70d1f" (UID: "d42ade11-691f-471f-ade0-0d9e12c70d1f"). InnerVolumeSpecName "kube-api-access-qmcbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.319043 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d42ade11-691f-471f-ade0-0d9e12c70d1f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.319090 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmcbv\" (UniqueName: \"kubernetes.io/projected/d42ade11-691f-471f-ade0-0d9e12c70d1f-kube-api-access-qmcbv\") on node \"crc\" DevicePath \"\"" Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.524498 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"] Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.550505 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"] Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.657595 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" event={"ID":"d42ade11-691f-471f-ade0-0d9e12c70d1f","Type":"ContainerDied","Data":"57b5c9ebab15dec960db41e797d308da20fb69805b2d99394a6db01cb3b22ceb"} Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.657676 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57b5c9ebab15dec960db41e797d308da20fb69805b2d99394a6db01cb3b22ceb" Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.657720 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" Mar 01 10:15:05 crc kubenswrapper[4792]: I0301 10:15:05.424037 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29833925-b21b-44d4-954c-e3252e5e69c4" path="/var/lib/kubelet/pods/29833925-b21b-44d4-954c-e3252e5e69c4/volumes" Mar 01 10:15:06 crc kubenswrapper[4792]: I0301 10:15:06.409835 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:15:06 crc kubenswrapper[4792]: E0301 10:15:06.410686 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:15:19 crc kubenswrapper[4792]: I0301 10:15:19.409643 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:15:19 crc kubenswrapper[4792]: E0301 10:15:19.410498 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:15:33 crc kubenswrapper[4792]: I0301 10:15:33.842863 4792 scope.go:117] "RemoveContainer" containerID="2ef9903d192bdc03acaf2fe74facecbf1f886cfe12a9446e1e544403dd9c2365" Mar 01 10:15:34 crc kubenswrapper[4792]: I0301 10:15:34.409651 4792 scope.go:117] "RemoveContainer" 
containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:15:34 crc kubenswrapper[4792]: E0301 10:15:34.410157 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:15:34 crc kubenswrapper[4792]: I0301 10:15:34.918196 4792 generic.go:334] "Generic (PLEG): container finished" podID="ee1c75ce-61f7-4ce5-a757-b7405d7135bd" containerID="0b1c921f1338ea9b8f3dd9b08a6d658d40119ab101643c7964b03d38bfa73f47" exitCode=0 Mar 01 10:15:34 crc kubenswrapper[4792]: I0301 10:15:34.918249 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ee1c75ce-61f7-4ce5-a757-b7405d7135bd","Type":"ContainerDied","Data":"0b1c921f1338ea9b8f3dd9b08a6d658d40119ab101643c7964b03d38bfa73f47"} Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.414411 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603171 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-temporary\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603222 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ssh-key\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603243 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-config-data\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603322 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slzdj\" (UniqueName: \"kubernetes.io/projected/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-kube-api-access-slzdj\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603388 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-workdir\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603485 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config-secret\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603515 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603603 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ca-certs\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603655 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.606033 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.609196 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.609505 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.609970 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-config-data" (OuterVolumeSpecName: "config-data") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.610354 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-kube-api-access-slzdj" (OuterVolumeSpecName: "kube-api-access-slzdj") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "kube-api-access-slzdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.644031 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.647371 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.658549 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.661045 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706155 4792 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706704 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706725 4792 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706750 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706766 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706779 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slzdj\" (UniqueName: \"kubernetes.io/projected/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-kube-api-access-slzdj\") on node \"crc\" DevicePath \"\"" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706792 4792 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 01 10:15:36 crc 
kubenswrapper[4792]: I0301 10:15:36.706804 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706817 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.730083 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.808100 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.934025 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ee1c75ce-61f7-4ce5-a757-b7405d7135bd","Type":"ContainerDied","Data":"56c23af9ad5afce4e187f9687e375e2437b0d33726c9ce5a2f51b334bc434432"} Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.934296 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56c23af9ad5afce4e187f9687e375e2437b0d33726c9ce5a2f51b334bc434432" Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.934072 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.870256 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 01 10:15:46 crc kubenswrapper[4792]: E0301 10:15:46.871278 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d42ade11-691f-471f-ade0-0d9e12c70d1f" containerName="collect-profiles" Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.871297 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42ade11-691f-471f-ade0-0d9e12c70d1f" containerName="collect-profiles" Mar 01 10:15:46 crc kubenswrapper[4792]: E0301 10:15:46.871346 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1c75ce-61f7-4ce5-a757-b7405d7135bd" containerName="tempest-tests-tempest-tests-runner" Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.871356 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1c75ce-61f7-4ce5-a757-b7405d7135bd" containerName="tempest-tests-tempest-tests-runner" Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.871537 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d42ade11-691f-471f-ade0-0d9e12c70d1f" containerName="collect-profiles" Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.871553 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1c75ce-61f7-4ce5-a757-b7405d7135bd" containerName="tempest-tests-tempest-tests-runner" Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.872300 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.874033 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nl48r" Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.894829 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.007773 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"478d8531-4e8e-4775-999d-42af4afef106\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.007931 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl7g4\" (UniqueName: \"kubernetes.io/projected/478d8531-4e8e-4775-999d-42af4afef106-kube-api-access-xl7g4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"478d8531-4e8e-4775-999d-42af4afef106\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.109728 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"478d8531-4e8e-4775-999d-42af4afef106\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.109872 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl7g4\" (UniqueName: 
\"kubernetes.io/projected/478d8531-4e8e-4775-999d-42af4afef106-kube-api-access-xl7g4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"478d8531-4e8e-4775-999d-42af4afef106\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.110978 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"478d8531-4e8e-4775-999d-42af4afef106\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.135855 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl7g4\" (UniqueName: \"kubernetes.io/projected/478d8531-4e8e-4775-999d-42af4afef106-kube-api-access-xl7g4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"478d8531-4e8e-4775-999d-42af4afef106\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.139101 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"478d8531-4e8e-4775-999d-42af4afef106\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.188020 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.695714 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 01 10:15:48 crc kubenswrapper[4792]: I0301 10:15:48.033838 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"478d8531-4e8e-4775-999d-42af4afef106","Type":"ContainerStarted","Data":"6a3ed20822578164d0bed56841e7cdf7da3fe3a1374b8fe4d4a5d98a5767e2c1"} Mar 01 10:15:48 crc kubenswrapper[4792]: I0301 10:15:48.409057 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:15:48 crc kubenswrapper[4792]: E0301 10:15:48.409573 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:15:50 crc kubenswrapper[4792]: I0301 10:15:50.049383 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"478d8531-4e8e-4775-999d-42af4afef106","Type":"ContainerStarted","Data":"3bf971832b4d249de6ae0b44a82cfecf495c4144dfad64713432eabd53146ea2"} Mar 01 10:15:50 crc kubenswrapper[4792]: I0301 10:15:50.065923 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.785206419 podStartE2EDuration="4.065887807s" podCreationTimestamp="2026-03-01 10:15:46 +0000 UTC" firstStartedPulling="2026-03-01 
10:15:47.680799497 +0000 UTC m=+4076.922678694" lastFinishedPulling="2026-03-01 10:15:48.961480885 +0000 UTC m=+4078.203360082" observedRunningTime="2026-03-01 10:15:50.063880647 +0000 UTC m=+4079.305759844" watchObservedRunningTime="2026-03-01 10:15:50.065887807 +0000 UTC m=+4079.307767004" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.143576 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539336-vnhz7"] Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.145420 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539336-vnhz7" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.147941 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.148022 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.148404 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.152955 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539336-vnhz7"] Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.310604 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkklg\" (UniqueName: \"kubernetes.io/projected/731500c7-53e0-431a-9ec7-7e56ef9c11ee-kube-api-access-bkklg\") pod \"auto-csr-approver-29539336-vnhz7\" (UID: \"731500c7-53e0-431a-9ec7-7e56ef9c11ee\") " pod="openshift-infra/auto-csr-approver-29539336-vnhz7" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.412814 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkklg\" (UniqueName: 
\"kubernetes.io/projected/731500c7-53e0-431a-9ec7-7e56ef9c11ee-kube-api-access-bkklg\") pod \"auto-csr-approver-29539336-vnhz7\" (UID: \"731500c7-53e0-431a-9ec7-7e56ef9c11ee\") " pod="openshift-infra/auto-csr-approver-29539336-vnhz7" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.438093 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkklg\" (UniqueName: \"kubernetes.io/projected/731500c7-53e0-431a-9ec7-7e56ef9c11ee-kube-api-access-bkklg\") pod \"auto-csr-approver-29539336-vnhz7\" (UID: \"731500c7-53e0-431a-9ec7-7e56ef9c11ee\") " pod="openshift-infra/auto-csr-approver-29539336-vnhz7" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.470056 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539336-vnhz7" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.939538 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539336-vnhz7"] Mar 01 10:16:01 crc kubenswrapper[4792]: I0301 10:16:01.176632 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539336-vnhz7" event={"ID":"731500c7-53e0-431a-9ec7-7e56ef9c11ee","Type":"ContainerStarted","Data":"f0a611fca5aa0d9b839a3b52b4e4a4b3d492bced131624b7eed8f41ebfde0b32"} Mar 01 10:16:02 crc kubenswrapper[4792]: I0301 10:16:02.185031 4792 generic.go:334] "Generic (PLEG): container finished" podID="731500c7-53e0-431a-9ec7-7e56ef9c11ee" containerID="457400b5efe76b08cc35b2db4e3b43b3bb9dd95107e6fe2815b55395cf597f33" exitCode=0 Mar 01 10:16:02 crc kubenswrapper[4792]: I0301 10:16:02.185083 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539336-vnhz7" event={"ID":"731500c7-53e0-431a-9ec7-7e56ef9c11ee","Type":"ContainerDied","Data":"457400b5efe76b08cc35b2db4e3b43b3bb9dd95107e6fe2815b55395cf597f33"} Mar 01 10:16:03 crc kubenswrapper[4792]: I0301 10:16:03.411477 4792 scope.go:117] 
"RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:16:03 crc kubenswrapper[4792]: E0301 10:16:03.412307 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:16:03 crc kubenswrapper[4792]: I0301 10:16:03.683522 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539336-vnhz7" Mar 01 10:16:03 crc kubenswrapper[4792]: I0301 10:16:03.803460 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkklg\" (UniqueName: \"kubernetes.io/projected/731500c7-53e0-431a-9ec7-7e56ef9c11ee-kube-api-access-bkklg\") pod \"731500c7-53e0-431a-9ec7-7e56ef9c11ee\" (UID: \"731500c7-53e0-431a-9ec7-7e56ef9c11ee\") " Mar 01 10:16:03 crc kubenswrapper[4792]: I0301 10:16:03.808252 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/731500c7-53e0-431a-9ec7-7e56ef9c11ee-kube-api-access-bkklg" (OuterVolumeSpecName: "kube-api-access-bkklg") pod "731500c7-53e0-431a-9ec7-7e56ef9c11ee" (UID: "731500c7-53e0-431a-9ec7-7e56ef9c11ee"). InnerVolumeSpecName "kube-api-access-bkklg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:16:03 crc kubenswrapper[4792]: I0301 10:16:03.907374 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkklg\" (UniqueName: \"kubernetes.io/projected/731500c7-53e0-431a-9ec7-7e56ef9c11ee-kube-api-access-bkklg\") on node \"crc\" DevicePath \"\"" Mar 01 10:16:04 crc kubenswrapper[4792]: I0301 10:16:04.203149 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539336-vnhz7" event={"ID":"731500c7-53e0-431a-9ec7-7e56ef9c11ee","Type":"ContainerDied","Data":"f0a611fca5aa0d9b839a3b52b4e4a4b3d492bced131624b7eed8f41ebfde0b32"} Mar 01 10:16:04 crc kubenswrapper[4792]: I0301 10:16:04.203200 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0a611fca5aa0d9b839a3b52b4e4a4b3d492bced131624b7eed8f41ebfde0b32" Mar 01 10:16:04 crc kubenswrapper[4792]: I0301 10:16:04.203201 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539336-vnhz7" Mar 01 10:16:04 crc kubenswrapper[4792]: I0301 10:16:04.759004 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539330-wqct7"] Mar 01 10:16:04 crc kubenswrapper[4792]: I0301 10:16:04.766580 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539330-wqct7"] Mar 01 10:16:05 crc kubenswrapper[4792]: I0301 10:16:05.421575 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9025376-622f-4b7e-94d5-c5b136e139d8" path="/var/lib/kubelet/pods/a9025376-622f-4b7e-94d5-c5b136e139d8/volumes" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.491296 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8szkd/must-gather-5ntfl"] Mar 01 10:16:11 crc kubenswrapper[4792]: E0301 10:16:11.492140 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="731500c7-53e0-431a-9ec7-7e56ef9c11ee" containerName="oc" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.492151 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="731500c7-53e0-431a-9ec7-7e56ef9c11ee" containerName="oc" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.492325 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="731500c7-53e0-431a-9ec7-7e56ef9c11ee" containerName="oc" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.493713 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.496394 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8szkd"/"default-dockercfg-sh7xn" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.496502 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8szkd"/"openshift-service-ca.crt" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.496654 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8szkd"/"kube-root-ca.crt" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.519168 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8szkd/must-gather-5ntfl"] Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.661213 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58lp7\" (UniqueName: \"kubernetes.io/projected/c72d6020-9460-4198-863a-ec32bc90fee9-kube-api-access-58lp7\") pod \"must-gather-5ntfl\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.661321 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/c72d6020-9460-4198-863a-ec32bc90fee9-must-gather-output\") pod \"must-gather-5ntfl\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.762883 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c72d6020-9460-4198-863a-ec32bc90fee9-must-gather-output\") pod \"must-gather-5ntfl\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.763247 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c72d6020-9460-4198-863a-ec32bc90fee9-must-gather-output\") pod \"must-gather-5ntfl\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.763531 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58lp7\" (UniqueName: \"kubernetes.io/projected/c72d6020-9460-4198-863a-ec32bc90fee9-kube-api-access-58lp7\") pod \"must-gather-5ntfl\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.781400 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58lp7\" (UniqueName: \"kubernetes.io/projected/c72d6020-9460-4198-863a-ec32bc90fee9-kube-api-access-58lp7\") pod \"must-gather-5ntfl\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.813423 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:12 crc kubenswrapper[4792]: I0301 10:16:12.333360 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8szkd/must-gather-5ntfl"] Mar 01 10:16:13 crc kubenswrapper[4792]: I0301 10:16:13.283292 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/must-gather-5ntfl" event={"ID":"c72d6020-9460-4198-863a-ec32bc90fee9","Type":"ContainerStarted","Data":"740afa23c87b88c924b6a67375351371f8d7a97fff51d47250af7695c11c9757"} Mar 01 10:16:17 crc kubenswrapper[4792]: I0301 10:16:17.412311 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:16:17 crc kubenswrapper[4792]: E0301 10:16:17.413168 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.607405 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6z92d"] Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.610547 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.616300 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6z92d"] Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.679388 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-catalog-content\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.679626 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drslj\" (UniqueName: \"kubernetes.io/projected/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-kube-api-access-drslj\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.679985 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-utilities\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.782318 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-utilities\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.782649 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-catalog-content\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.782761 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drslj\" (UniqueName: \"kubernetes.io/projected/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-kube-api-access-drslj\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.782803 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-utilities\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.783090 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-catalog-content\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.810630 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drslj\" (UniqueName: \"kubernetes.io/projected/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-kube-api-access-drslj\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.947261 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:20 crc kubenswrapper[4792]: I0301 10:16:20.422109 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/must-gather-5ntfl" event={"ID":"c72d6020-9460-4198-863a-ec32bc90fee9","Type":"ContainerStarted","Data":"882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a"} Mar 01 10:16:20 crc kubenswrapper[4792]: I0301 10:16:20.422469 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/must-gather-5ntfl" event={"ID":"c72d6020-9460-4198-863a-ec32bc90fee9","Type":"ContainerStarted","Data":"3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c"} Mar 01 10:16:20 crc kubenswrapper[4792]: I0301 10:16:20.469525 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8szkd/must-gather-5ntfl" podStartSLOduration=2.323804022 podStartE2EDuration="9.469507181s" podCreationTimestamp="2026-03-01 10:16:11 +0000 UTC" firstStartedPulling="2026-03-01 10:16:12.335162831 +0000 UTC m=+4101.577042028" lastFinishedPulling="2026-03-01 10:16:19.48086599 +0000 UTC m=+4108.722745187" observedRunningTime="2026-03-01 10:16:20.445403702 +0000 UTC m=+4109.687282899" watchObservedRunningTime="2026-03-01 10:16:20.469507181 +0000 UTC m=+4109.711386378" Mar 01 10:16:20 crc kubenswrapper[4792]: I0301 10:16:20.499038 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6z92d"] Mar 01 10:16:21 crc kubenswrapper[4792]: I0301 10:16:21.453329 4792 generic.go:334] "Generic (PLEG): container finished" podID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerID="12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e" exitCode=0 Mar 01 10:16:21 crc kubenswrapper[4792]: I0301 10:16:21.454488 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z92d" 
event={"ID":"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f","Type":"ContainerDied","Data":"12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e"} Mar 01 10:16:21 crc kubenswrapper[4792]: I0301 10:16:21.454514 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z92d" event={"ID":"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f","Type":"ContainerStarted","Data":"9653c39b1fc9d37ab3fa3dc401ab1732f3efcb27add0085798088c83918fe421"} Mar 01 10:16:22 crc kubenswrapper[4792]: I0301 10:16:22.467370 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z92d" event={"ID":"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f","Type":"ContainerStarted","Data":"e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7"} Mar 01 10:16:25 crc kubenswrapper[4792]: E0301 10:16:25.358837 4792 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.89:37230->38.102.83.89:34111: write tcp 38.102.83.89:37230->38.102.83.89:34111: write: broken pipe Mar 01 10:16:25 crc kubenswrapper[4792]: E0301 10:16:25.512669 4792 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.89:37276->38.102.83.89:34111: write tcp 38.102.83.89:37276->38.102.83.89:34111: write: connection reset by peer Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.188635 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8szkd/crc-debug-zgt6k"] Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.190469 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.246723 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-host\") pod \"crc-debug-zgt6k\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.246850 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nbbf\" (UniqueName: \"kubernetes.io/projected/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-kube-api-access-9nbbf\") pod \"crc-debug-zgt6k\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.349360 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-host\") pod \"crc-debug-zgt6k\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.349507 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-host\") pod \"crc-debug-zgt6k\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.349757 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nbbf\" (UniqueName: \"kubernetes.io/projected/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-kube-api-access-9nbbf\") pod \"crc-debug-zgt6k\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc 
kubenswrapper[4792]: I0301 10:16:27.371176 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nbbf\" (UniqueName: \"kubernetes.io/projected/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-kube-api-access-9nbbf\") pod \"crc-debug-zgt6k\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.506856 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.512624 4792 generic.go:334] "Generic (PLEG): container finished" podID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerID="e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7" exitCode=0 Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.512696 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z92d" event={"ID":"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f","Type":"ContainerDied","Data":"e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7"} Mar 01 10:16:28 crc kubenswrapper[4792]: I0301 10:16:28.410009 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:16:28 crc kubenswrapper[4792]: E0301 10:16:28.410659 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:16:28 crc kubenswrapper[4792]: I0301 10:16:28.531120 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z92d" 
event={"ID":"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f","Type":"ContainerStarted","Data":"a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09"} Mar 01 10:16:28 crc kubenswrapper[4792]: I0301 10:16:28.533275 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" event={"ID":"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06","Type":"ContainerStarted","Data":"434542fb734e119099c99109e2963f44067d819ceef6faa0696fa255b4e3073f"} Mar 01 10:16:28 crc kubenswrapper[4792]: I0301 10:16:28.557439 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6z92d" podStartSLOduration=3.06851316 podStartE2EDuration="9.557419906s" podCreationTimestamp="2026-03-01 10:16:19 +0000 UTC" firstStartedPulling="2026-03-01 10:16:21.459101366 +0000 UTC m=+4110.700980563" lastFinishedPulling="2026-03-01 10:16:27.948008112 +0000 UTC m=+4117.189887309" observedRunningTime="2026-03-01 10:16:28.54510314 +0000 UTC m=+4117.786982337" watchObservedRunningTime="2026-03-01 10:16:28.557419906 +0000 UTC m=+4117.799299103" Mar 01 10:16:29 crc kubenswrapper[4792]: I0301 10:16:29.947773 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:29 crc kubenswrapper[4792]: I0301 10:16:29.948093 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:31 crc kubenswrapper[4792]: I0301 10:16:31.000702 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6z92d" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" probeResult="failure" output=< Mar 01 10:16:31 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:16:31 crc kubenswrapper[4792]: > Mar 01 10:16:33 crc kubenswrapper[4792]: I0301 10:16:33.936096 4792 scope.go:117] "RemoveContainer" 
containerID="efa17f624abb4097f736a6d5cc9c3131b915c07d36d54a5857fd159d31a5a613" Mar 01 10:16:40 crc kubenswrapper[4792]: I0301 10:16:40.409548 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:16:40 crc kubenswrapper[4792]: E0301 10:16:40.410354 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:16:40 crc kubenswrapper[4792]: I0301 10:16:40.653279 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" event={"ID":"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06","Type":"ContainerStarted","Data":"12a4ef8d3fa3b5f92a7e689e1d88b84e48e9aed058fb8fe4d0d9b00831ceeea9"} Mar 01 10:16:40 crc kubenswrapper[4792]: I0301 10:16:40.671610 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" podStartSLOduration=1.830996534 podStartE2EDuration="13.671593608s" podCreationTimestamp="2026-03-01 10:16:27 +0000 UTC" firstStartedPulling="2026-03-01 10:16:27.572539669 +0000 UTC m=+4116.814418866" lastFinishedPulling="2026-03-01 10:16:39.413136743 +0000 UTC m=+4128.655015940" observedRunningTime="2026-03-01 10:16:40.669212188 +0000 UTC m=+4129.911091385" watchObservedRunningTime="2026-03-01 10:16:40.671593608 +0000 UTC m=+4129.913472805" Mar 01 10:16:41 crc kubenswrapper[4792]: I0301 10:16:41.122163 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6z92d" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" probeResult="failure" output=< Mar 01 10:16:41 
crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:16:41 crc kubenswrapper[4792]: > Mar 01 10:16:50 crc kubenswrapper[4792]: I0301 10:16:50.997194 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6z92d" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" probeResult="failure" output=< Mar 01 10:16:50 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:16:50 crc kubenswrapper[4792]: > Mar 01 10:16:53 crc kubenswrapper[4792]: I0301 10:16:53.408801 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:16:53 crc kubenswrapper[4792]: E0301 10:16:53.409762 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:17:01 crc kubenswrapper[4792]: I0301 10:17:01.011196 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6z92d" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" probeResult="failure" output=< Mar 01 10:17:01 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:17:01 crc kubenswrapper[4792]: > Mar 01 10:17:04 crc kubenswrapper[4792]: I0301 10:17:04.409194 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:17:04 crc kubenswrapper[4792]: E0301 10:17:04.409953 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:17:09 crc kubenswrapper[4792]: I0301 10:17:09.996141 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:17:10 crc kubenswrapper[4792]: I0301 10:17:10.051087 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:17:10 crc kubenswrapper[4792]: I0301 10:17:10.235070 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6z92d"] Mar 01 10:17:11 crc kubenswrapper[4792]: I0301 10:17:11.402220 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6z92d" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" containerID="cri-o://a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09" gracePeriod=2 Mar 01 10:17:11 crc kubenswrapper[4792]: I0301 10:17:11.988467 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.149510 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-utilities\") pod \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.149565 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-catalog-content\") pod \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.149737 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drslj\" (UniqueName: \"kubernetes.io/projected/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-kube-api-access-drslj\") pod \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.150364 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-utilities" (OuterVolumeSpecName: "utilities") pod "7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" (UID: "7fb9c2d9-68e1-4fca-9749-7abfe0778f3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.162548 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-kube-api-access-drslj" (OuterVolumeSpecName: "kube-api-access-drslj") pod "7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" (UID: "7fb9c2d9-68e1-4fca-9749-7abfe0778f3f"). InnerVolumeSpecName "kube-api-access-drslj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.252405 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drslj\" (UniqueName: \"kubernetes.io/projected/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-kube-api-access-drslj\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.252437 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.279892 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" (UID: "7fb9c2d9-68e1-4fca-9749-7abfe0778f3f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.354141 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.411263 4792 generic.go:334] "Generic (PLEG): container finished" podID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerID="a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09" exitCode=0 Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.411307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z92d" event={"ID":"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f","Type":"ContainerDied","Data":"a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09"} Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.411344 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-6z92d" event={"ID":"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f","Type":"ContainerDied","Data":"9653c39b1fc9d37ab3fa3dc401ab1732f3efcb27add0085798088c83918fe421"} Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.411344 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.411361 4792 scope.go:117] "RemoveContainer" containerID="a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.431111 4792 scope.go:117] "RemoveContainer" containerID="e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.474988 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6z92d"] Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.485460 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6z92d"] Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.488165 4792 scope.go:117] "RemoveContainer" containerID="12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.510868 4792 scope.go:117] "RemoveContainer" containerID="a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09" Mar 01 10:17:12 crc kubenswrapper[4792]: E0301 10:17:12.511528 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09\": container with ID starting with a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09 not found: ID does not exist" containerID="a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.511558 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09"} err="failed to get container status \"a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09\": rpc error: code = NotFound desc = could not find container \"a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09\": container with ID starting with a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09 not found: ID does not exist" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.511580 4792 scope.go:117] "RemoveContainer" containerID="e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7" Mar 01 10:17:12 crc kubenswrapper[4792]: E0301 10:17:12.513312 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7\": container with ID starting with e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7 not found: ID does not exist" containerID="e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.513355 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7"} err="failed to get container status \"e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7\": rpc error: code = NotFound desc = could not find container \"e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7\": container with ID starting with e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7 not found: ID does not exist" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.513383 4792 scope.go:117] "RemoveContainer" containerID="12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e" Mar 01 10:17:12 crc kubenswrapper[4792]: E0301 
10:17:12.514206 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e\": container with ID starting with 12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e not found: ID does not exist" containerID="12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.514234 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e"} err="failed to get container status \"12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e\": rpc error: code = NotFound desc = could not find container \"12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e\": container with ID starting with 12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e not found: ID does not exist" Mar 01 10:17:13 crc kubenswrapper[4792]: I0301 10:17:13.422418 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" path="/var/lib/kubelet/pods/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f/volumes" Mar 01 10:17:15 crc kubenswrapper[4792]: I0301 10:17:15.409831 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:17:15 crc kubenswrapper[4792]: E0301 10:17:15.410734 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:17:20 crc kubenswrapper[4792]: I0301 10:17:20.475593 
4792 generic.go:334] "Generic (PLEG): container finished" podID="21e1cd52-26d9-4c5d-b87b-b124e2b7cb06" containerID="12a4ef8d3fa3b5f92a7e689e1d88b84e48e9aed058fb8fe4d0d9b00831ceeea9" exitCode=0 Mar 01 10:17:20 crc kubenswrapper[4792]: I0301 10:17:20.475813 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" event={"ID":"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06","Type":"ContainerDied","Data":"12a4ef8d3fa3b5f92a7e689e1d88b84e48e9aed058fb8fe4d0d9b00831ceeea9"} Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.663228 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.744216 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nbbf\" (UniqueName: \"kubernetes.io/projected/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-kube-api-access-9nbbf\") pod \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.744269 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-host\") pod \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.744418 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-host" (OuterVolumeSpecName: "host") pod "21e1cd52-26d9-4c5d-b87b-b124e2b7cb06" (UID: "21e1cd52-26d9-4c5d-b87b-b124e2b7cb06"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.744719 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-host\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.752165 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-kube-api-access-9nbbf" (OuterVolumeSpecName: "kube-api-access-9nbbf") pod "21e1cd52-26d9-4c5d-b87b-b124e2b7cb06" (UID: "21e1cd52-26d9-4c5d-b87b-b124e2b7cb06"). InnerVolumeSpecName "kube-api-access-9nbbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.782194 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8szkd/crc-debug-zgt6k"] Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.798355 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8szkd/crc-debug-zgt6k"] Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.847003 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nbbf\" (UniqueName: \"kubernetes.io/projected/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-kube-api-access-9nbbf\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:22 crc kubenswrapper[4792]: I0301 10:17:22.510306 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="434542fb734e119099c99109e2963f44067d819ceef6faa0696fa255b4e3073f" Mar 01 10:17:22 crc kubenswrapper[4792]: I0301 10:17:22.510367 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.122462 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8szkd/crc-debug-ph9f5"] Mar 01 10:17:23 crc kubenswrapper[4792]: E0301 10:17:23.122953 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e1cd52-26d9-4c5d-b87b-b124e2b7cb06" containerName="container-00" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.122969 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e1cd52-26d9-4c5d-b87b-b124e2b7cb06" containerName="container-00" Mar 01 10:17:23 crc kubenswrapper[4792]: E0301 10:17:23.122979 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="extract-content" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.122987 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="extract-content" Mar 01 10:17:23 crc kubenswrapper[4792]: E0301 10:17:23.123010 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.123019 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" Mar 01 10:17:23 crc kubenswrapper[4792]: E0301 10:17:23.123043 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="extract-utilities" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.123051 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="extract-utilities" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.123303 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="21e1cd52-26d9-4c5d-b87b-b124e2b7cb06" 
containerName="container-00" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.123317 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.124079 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.171416 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbr4g\" (UniqueName: \"kubernetes.io/projected/7f8b498c-79c9-4d69-85fe-5662bac5a08d-kube-api-access-wbr4g\") pod \"crc-debug-ph9f5\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.171538 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f8b498c-79c9-4d69-85fe-5662bac5a08d-host\") pod \"crc-debug-ph9f5\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.273151 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbr4g\" (UniqueName: \"kubernetes.io/projected/7f8b498c-79c9-4d69-85fe-5662bac5a08d-kube-api-access-wbr4g\") pod \"crc-debug-ph9f5\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.273240 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f8b498c-79c9-4d69-85fe-5662bac5a08d-host\") pod \"crc-debug-ph9f5\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc 
kubenswrapper[4792]: I0301 10:17:23.273436 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f8b498c-79c9-4d69-85fe-5662bac5a08d-host\") pod \"crc-debug-ph9f5\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.288618 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbr4g\" (UniqueName: \"kubernetes.io/projected/7f8b498c-79c9-4d69-85fe-5662bac5a08d-kube-api-access-wbr4g\") pod \"crc-debug-ph9f5\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.418367 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21e1cd52-26d9-4c5d-b87b-b124e2b7cb06" path="/var/lib/kubelet/pods/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06/volumes" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.441058 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.546891 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/crc-debug-ph9f5" event={"ID":"7f8b498c-79c9-4d69-85fe-5662bac5a08d","Type":"ContainerStarted","Data":"8d94543550dd8ed17603b227cf614a7d05921b5c966f5daaf670da82374caa69"} Mar 01 10:17:24 crc kubenswrapper[4792]: I0301 10:17:24.559392 4792 generic.go:334] "Generic (PLEG): container finished" podID="7f8b498c-79c9-4d69-85fe-5662bac5a08d" containerID="df8ecbd7090c652c476da0720390ff4d5644e5341ba47e2dc96da76468154054" exitCode=0 Mar 01 10:17:24 crc kubenswrapper[4792]: I0301 10:17:24.559513 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/crc-debug-ph9f5" event={"ID":"7f8b498c-79c9-4d69-85fe-5662bac5a08d","Type":"ContainerDied","Data":"df8ecbd7090c652c476da0720390ff4d5644e5341ba47e2dc96da76468154054"} Mar 01 10:17:24 crc kubenswrapper[4792]: I0301 10:17:24.973591 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8szkd/crc-debug-ph9f5"] Mar 01 10:17:24 crc kubenswrapper[4792]: I0301 10:17:24.981579 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8szkd/crc-debug-ph9f5"] Mar 01 10:17:25 crc kubenswrapper[4792]: I0301 10:17:25.671531 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:25 crc kubenswrapper[4792]: I0301 10:17:25.721896 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbr4g\" (UniqueName: \"kubernetes.io/projected/7f8b498c-79c9-4d69-85fe-5662bac5a08d-kube-api-access-wbr4g\") pod \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " Mar 01 10:17:25 crc kubenswrapper[4792]: I0301 10:17:25.722016 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f8b498c-79c9-4d69-85fe-5662bac5a08d-host\") pod \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " Mar 01 10:17:25 crc kubenswrapper[4792]: I0301 10:17:25.722558 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f8b498c-79c9-4d69-85fe-5662bac5a08d-host" (OuterVolumeSpecName: "host") pod "7f8b498c-79c9-4d69-85fe-5662bac5a08d" (UID: "7f8b498c-79c9-4d69-85fe-5662bac5a08d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:17:25 crc kubenswrapper[4792]: I0301 10:17:25.728110 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f8b498c-79c9-4d69-85fe-5662bac5a08d-kube-api-access-wbr4g" (OuterVolumeSpecName: "kube-api-access-wbr4g") pod "7f8b498c-79c9-4d69-85fe-5662bac5a08d" (UID: "7f8b498c-79c9-4d69-85fe-5662bac5a08d"). InnerVolumeSpecName "kube-api-access-wbr4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:17:25 crc kubenswrapper[4792]: I0301 10:17:25.824953 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbr4g\" (UniqueName: \"kubernetes.io/projected/7f8b498c-79c9-4d69-85fe-5662bac5a08d-kube-api-access-wbr4g\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:25 crc kubenswrapper[4792]: I0301 10:17:25.824999 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f8b498c-79c9-4d69-85fe-5662bac5a08d-host\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.240131 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8szkd/crc-debug-c6rzj"] Mar 01 10:17:26 crc kubenswrapper[4792]: E0301 10:17:26.240932 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f8b498c-79c9-4d69-85fe-5662bac5a08d" containerName="container-00" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.240949 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8b498c-79c9-4d69-85fe-5662bac5a08d" containerName="container-00" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.241131 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f8b498c-79c9-4d69-85fe-5662bac5a08d" containerName="container-00" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.241724 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.334444 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxkxn\" (UniqueName: \"kubernetes.io/projected/3548cb06-25bd-4b92-8633-a56e0996925f-kube-api-access-sxkxn\") pod \"crc-debug-c6rzj\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.334511 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3548cb06-25bd-4b92-8633-a56e0996925f-host\") pod \"crc-debug-c6rzj\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.435745 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3548cb06-25bd-4b92-8633-a56e0996925f-host\") pod \"crc-debug-c6rzj\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.435884 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3548cb06-25bd-4b92-8633-a56e0996925f-host\") pod \"crc-debug-c6rzj\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.435994 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxkxn\" (UniqueName: \"kubernetes.io/projected/3548cb06-25bd-4b92-8633-a56e0996925f-kube-api-access-sxkxn\") pod \"crc-debug-c6rzj\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:26 crc 
kubenswrapper[4792]: I0301 10:17:26.680129 4792 scope.go:117] "RemoveContainer" containerID="df8ecbd7090c652c476da0720390ff4d5644e5341ba47e2dc96da76468154054" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.680243 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.761588 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxkxn\" (UniqueName: \"kubernetes.io/projected/3548cb06-25bd-4b92-8633-a56e0996925f-kube-api-access-sxkxn\") pod \"crc-debug-c6rzj\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.863471 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:27 crc kubenswrapper[4792]: W0301 10:17:27.032292 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3548cb06_25bd_4b92_8633_a56e0996925f.slice/crio-c4ec4652ea067166a99eb4f583932618d94a381e324521e0c66f56b8a107e32d WatchSource:0}: Error finding container c4ec4652ea067166a99eb4f583932618d94a381e324521e0c66f56b8a107e32d: Status 404 returned error can't find the container with id c4ec4652ea067166a99eb4f583932618d94a381e324521e0c66f56b8a107e32d Mar 01 10:17:27 crc kubenswrapper[4792]: I0301 10:17:27.409450 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:17:27 crc kubenswrapper[4792]: E0301 10:17:27.409997 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:17:27 crc kubenswrapper[4792]: I0301 10:17:27.418652 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f8b498c-79c9-4d69-85fe-5662bac5a08d" path="/var/lib/kubelet/pods/7f8b498c-79c9-4d69-85fe-5662bac5a08d/volumes" Mar 01 10:17:27 crc kubenswrapper[4792]: I0301 10:17:27.691121 4792 generic.go:334] "Generic (PLEG): container finished" podID="3548cb06-25bd-4b92-8633-a56e0996925f" containerID="1fb264ea34cd0df53641cbc5584d4107d16d25be8d770a5ddb0d84a29a984936" exitCode=0 Mar 01 10:17:27 crc kubenswrapper[4792]: I0301 10:17:27.691199 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/crc-debug-c6rzj" event={"ID":"3548cb06-25bd-4b92-8633-a56e0996925f","Type":"ContainerDied","Data":"1fb264ea34cd0df53641cbc5584d4107d16d25be8d770a5ddb0d84a29a984936"} Mar 01 10:17:27 crc kubenswrapper[4792]: I0301 10:17:27.691241 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/crc-debug-c6rzj" event={"ID":"3548cb06-25bd-4b92-8633-a56e0996925f","Type":"ContainerStarted","Data":"c4ec4652ea067166a99eb4f583932618d94a381e324521e0c66f56b8a107e32d"} Mar 01 10:17:27 crc kubenswrapper[4792]: I0301 10:17:27.731556 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8szkd/crc-debug-c6rzj"] Mar 01 10:17:27 crc kubenswrapper[4792]: I0301 10:17:27.740227 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8szkd/crc-debug-c6rzj"] Mar 01 10:17:28 crc kubenswrapper[4792]: I0301 10:17:28.879634 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:28 crc kubenswrapper[4792]: I0301 10:17:28.912420 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3548cb06-25bd-4b92-8633-a56e0996925f-host\") pod \"3548cb06-25bd-4b92-8633-a56e0996925f\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " Mar 01 10:17:28 crc kubenswrapper[4792]: I0301 10:17:28.912536 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxkxn\" (UniqueName: \"kubernetes.io/projected/3548cb06-25bd-4b92-8633-a56e0996925f-kube-api-access-sxkxn\") pod \"3548cb06-25bd-4b92-8633-a56e0996925f\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " Mar 01 10:17:28 crc kubenswrapper[4792]: I0301 10:17:28.912535 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3548cb06-25bd-4b92-8633-a56e0996925f-host" (OuterVolumeSpecName: "host") pod "3548cb06-25bd-4b92-8633-a56e0996925f" (UID: "3548cb06-25bd-4b92-8633-a56e0996925f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:17:28 crc kubenswrapper[4792]: I0301 10:17:28.912942 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3548cb06-25bd-4b92-8633-a56e0996925f-host\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:28 crc kubenswrapper[4792]: I0301 10:17:28.918069 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3548cb06-25bd-4b92-8633-a56e0996925f-kube-api-access-sxkxn" (OuterVolumeSpecName: "kube-api-access-sxkxn") pod "3548cb06-25bd-4b92-8633-a56e0996925f" (UID: "3548cb06-25bd-4b92-8633-a56e0996925f"). InnerVolumeSpecName "kube-api-access-sxkxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:17:29 crc kubenswrapper[4792]: I0301 10:17:29.014777 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxkxn\" (UniqueName: \"kubernetes.io/projected/3548cb06-25bd-4b92-8633-a56e0996925f-kube-api-access-sxkxn\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:29 crc kubenswrapper[4792]: I0301 10:17:29.421209 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3548cb06-25bd-4b92-8633-a56e0996925f" path="/var/lib/kubelet/pods/3548cb06-25bd-4b92-8633-a56e0996925f/volumes" Mar 01 10:17:29 crc kubenswrapper[4792]: I0301 10:17:29.709117 4792 scope.go:117] "RemoveContainer" containerID="1fb264ea34cd0df53641cbc5584d4107d16d25be8d770a5ddb0d84a29a984936" Mar 01 10:17:29 crc kubenswrapper[4792]: I0301 10:17:29.709494 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.768961 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8qspr"] Mar 01 10:17:39 crc kubenswrapper[4792]: E0301 10:17:39.769869 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3548cb06-25bd-4b92-8633-a56e0996925f" containerName="container-00" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.769881 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3548cb06-25bd-4b92-8633-a56e0996925f" containerName="container-00" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.770097 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3548cb06-25bd-4b92-8633-a56e0996925f" containerName="container-00" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.771624 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.793301 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8qspr"] Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.834079 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-utilities\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.834154 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-catalog-content\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.834208 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwqhw\" (UniqueName: \"kubernetes.io/projected/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-kube-api-access-qwqhw\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.935456 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-catalog-content\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.935531 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qwqhw\" (UniqueName: \"kubernetes.io/projected/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-kube-api-access-qwqhw\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.935644 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-utilities\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.936147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-utilities\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.936356 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-catalog-content\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.957038 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwqhw\" (UniqueName: \"kubernetes.io/projected/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-kube-api-access-qwqhw\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:40 crc kubenswrapper[4792]: I0301 10:17:40.090244 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:40 crc kubenswrapper[4792]: I0301 10:17:40.712756 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8qspr"] Mar 01 10:17:40 crc kubenswrapper[4792]: I0301 10:17:40.798529 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qspr" event={"ID":"4edec827-16e6-4138-a7f0-0b84d0c3dfa6","Type":"ContainerStarted","Data":"6c752bb11d7e7fc2f68f413de4a7285c4f5635a9012e66eda517f2d86c1445a2"} Mar 01 10:17:41 crc kubenswrapper[4792]: I0301 10:17:41.809054 4792 generic.go:334] "Generic (PLEG): container finished" podID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerID="43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17" exitCode=0 Mar 01 10:17:41 crc kubenswrapper[4792]: I0301 10:17:41.809179 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qspr" event={"ID":"4edec827-16e6-4138-a7f0-0b84d0c3dfa6","Type":"ContainerDied","Data":"43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17"} Mar 01 10:17:42 crc kubenswrapper[4792]: I0301 10:17:42.408757 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:17:42 crc kubenswrapper[4792]: E0301 10:17:42.409259 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:17:42 crc kubenswrapper[4792]: I0301 10:17:42.819373 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qspr" 
event={"ID":"4edec827-16e6-4138-a7f0-0b84d0c3dfa6","Type":"ContainerStarted","Data":"608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60"} Mar 01 10:17:44 crc kubenswrapper[4792]: I0301 10:17:44.837561 4792 generic.go:334] "Generic (PLEG): container finished" podID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerID="608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60" exitCode=0 Mar 01 10:17:44 crc kubenswrapper[4792]: I0301 10:17:44.837658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qspr" event={"ID":"4edec827-16e6-4138-a7f0-0b84d0c3dfa6","Type":"ContainerDied","Data":"608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60"} Mar 01 10:17:45 crc kubenswrapper[4792]: I0301 10:17:45.853823 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qspr" event={"ID":"4edec827-16e6-4138-a7f0-0b84d0c3dfa6","Type":"ContainerStarted","Data":"8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa"} Mar 01 10:17:45 crc kubenswrapper[4792]: I0301 10:17:45.895720 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8qspr" podStartSLOduration=3.497556296 podStartE2EDuration="6.895703404s" podCreationTimestamp="2026-03-01 10:17:39 +0000 UTC" firstStartedPulling="2026-03-01 10:17:41.81168804 +0000 UTC m=+4191.053567237" lastFinishedPulling="2026-03-01 10:17:45.209835148 +0000 UTC m=+4194.451714345" observedRunningTime="2026-03-01 10:17:45.884296201 +0000 UTC m=+4195.126175398" watchObservedRunningTime="2026-03-01 10:17:45.895703404 +0000 UTC m=+4195.137582601" Mar 01 10:17:50 crc kubenswrapper[4792]: I0301 10:17:50.090687 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:50 crc kubenswrapper[4792]: I0301 10:17:50.091279 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:50 crc kubenswrapper[4792]: I0301 10:17:50.201041 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:50 crc kubenswrapper[4792]: I0301 10:17:50.959624 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:51 crc kubenswrapper[4792]: I0301 10:17:51.031051 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8qspr"] Mar 01 10:17:52 crc kubenswrapper[4792]: I0301 10:17:52.916566 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8qspr" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="registry-server" containerID="cri-o://8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa" gracePeriod=2 Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.895467 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.922419 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-catalog-content\") pod \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.922504 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-utilities\") pod \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.922522 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwqhw\" (UniqueName: \"kubernetes.io/projected/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-kube-api-access-qwqhw\") pod \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.924125 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-utilities" (OuterVolumeSpecName: "utilities") pod "4edec827-16e6-4138-a7f0-0b84d0c3dfa6" (UID: "4edec827-16e6-4138-a7f0-0b84d0c3dfa6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.946705 4792 generic.go:334] "Generic (PLEG): container finished" podID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerID="8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa" exitCode=0 Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.946753 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qspr" event={"ID":"4edec827-16e6-4138-a7f0-0b84d0c3dfa6","Type":"ContainerDied","Data":"8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa"} Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.946785 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qspr" event={"ID":"4edec827-16e6-4138-a7f0-0b84d0c3dfa6","Type":"ContainerDied","Data":"6c752bb11d7e7fc2f68f413de4a7285c4f5635a9012e66eda517f2d86c1445a2"} Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.946809 4792 scope.go:117] "RemoveContainer" containerID="8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa" Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.947431 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.957917 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-kube-api-access-qwqhw" (OuterVolumeSpecName: "kube-api-access-qwqhw") pod "4edec827-16e6-4138-a7f0-0b84d0c3dfa6" (UID: "4edec827-16e6-4138-a7f0-0b84d0c3dfa6"). InnerVolumeSpecName "kube-api-access-qwqhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.007858 4792 scope.go:117] "RemoveContainer" containerID="608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.026753 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.026785 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwqhw\" (UniqueName: \"kubernetes.io/projected/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-kube-api-access-qwqhw\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.041557 4792 scope.go:117] "RemoveContainer" containerID="43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.094896 4792 scope.go:117] "RemoveContainer" containerID="8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa" Mar 01 10:17:54 crc kubenswrapper[4792]: E0301 10:17:54.095711 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa\": container with ID starting with 8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa not found: ID does not exist" containerID="8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.095753 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa"} err="failed to get container status \"8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa\": rpc error: code = NotFound desc = could not find container 
\"8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa\": container with ID starting with 8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa not found: ID does not exist" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.095778 4792 scope.go:117] "RemoveContainer" containerID="608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60" Mar 01 10:17:54 crc kubenswrapper[4792]: E0301 10:17:54.096276 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60\": container with ID starting with 608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60 not found: ID does not exist" containerID="608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.096324 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60"} err="failed to get container status \"608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60\": rpc error: code = NotFound desc = could not find container \"608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60\": container with ID starting with 608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60 not found: ID does not exist" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.096367 4792 scope.go:117] "RemoveContainer" containerID="43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17" Mar 01 10:17:54 crc kubenswrapper[4792]: E0301 10:17:54.096770 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17\": container with ID starting with 43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17 not found: ID does not exist" 
containerID="43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.096811 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17"} err="failed to get container status \"43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17\": rpc error: code = NotFound desc = could not find container \"43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17\": container with ID starting with 43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17 not found: ID does not exist" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.239914 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4edec827-16e6-4138-a7f0-0b84d0c3dfa6" (UID: "4edec827-16e6-4138-a7f0-0b84d0c3dfa6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.284478 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8qspr"] Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.292287 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8qspr"] Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.337354 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:55 crc kubenswrapper[4792]: I0301 10:17:55.408897 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:17:55 crc kubenswrapper[4792]: E0301 10:17:55.409533 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:17:55 crc kubenswrapper[4792]: I0301 10:17:55.418794 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" path="/var/lib/kubelet/pods/4edec827-16e6-4138-a7f0-0b84d0c3dfa6/volumes" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.148976 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539338-kcgfx"] Mar 01 10:18:00 crc kubenswrapper[4792]: E0301 10:18:00.149751 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="registry-server" Mar 01 10:18:00 crc 
kubenswrapper[4792]: I0301 10:18:00.149764 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="registry-server" Mar 01 10:18:00 crc kubenswrapper[4792]: E0301 10:18:00.149793 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="extract-content" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.149799 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="extract-content" Mar 01 10:18:00 crc kubenswrapper[4792]: E0301 10:18:00.149814 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="extract-utilities" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.149821 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="extract-utilities" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.150660 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="registry-server" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.212203 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.215028 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.216383 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.216997 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.219670 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539338-kcgfx"] Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.255034 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwkt8\" (UniqueName: \"kubernetes.io/projected/a36fce8b-7564-4d74-b3ad-9bfe9979cb67-kube-api-access-gwkt8\") pod \"auto-csr-approver-29539338-kcgfx\" (UID: \"a36fce8b-7564-4d74-b3ad-9bfe9979cb67\") " pod="openshift-infra/auto-csr-approver-29539338-kcgfx" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.356998 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwkt8\" (UniqueName: \"kubernetes.io/projected/a36fce8b-7564-4d74-b3ad-9bfe9979cb67-kube-api-access-gwkt8\") pod \"auto-csr-approver-29539338-kcgfx\" (UID: \"a36fce8b-7564-4d74-b3ad-9bfe9979cb67\") " pod="openshift-infra/auto-csr-approver-29539338-kcgfx" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.377874 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwkt8\" (UniqueName: \"kubernetes.io/projected/a36fce8b-7564-4d74-b3ad-9bfe9979cb67-kube-api-access-gwkt8\") pod \"auto-csr-approver-29539338-kcgfx\" (UID: \"a36fce8b-7564-4d74-b3ad-9bfe9979cb67\") " 
pod="openshift-infra/auto-csr-approver-29539338-kcgfx" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.559169 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" Mar 01 10:18:01 crc kubenswrapper[4792]: I0301 10:18:01.018677 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539338-kcgfx"] Mar 01 10:18:02 crc kubenswrapper[4792]: I0301 10:18:02.028793 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" event={"ID":"a36fce8b-7564-4d74-b3ad-9bfe9979cb67","Type":"ContainerStarted","Data":"122486aa561c7db6b94b6a3f7bfb44fc59bc39758ef217f2d542471c2f785f9f"} Mar 01 10:18:03 crc kubenswrapper[4792]: I0301 10:18:03.038889 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" event={"ID":"a36fce8b-7564-4d74-b3ad-9bfe9979cb67","Type":"ContainerStarted","Data":"dd7d1b10f4227bfce662716c9cbd0e72c4c6de3ac564b41925cd39f0160eafdf"} Mar 01 10:18:03 crc kubenswrapper[4792]: I0301 10:18:03.056535 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" podStartSLOduration=2.158880776 podStartE2EDuration="3.056513513s" podCreationTimestamp="2026-03-01 10:18:00 +0000 UTC" firstStartedPulling="2026-03-01 10:18:01.166538424 +0000 UTC m=+4210.408417621" lastFinishedPulling="2026-03-01 10:18:02.064171161 +0000 UTC m=+4211.306050358" observedRunningTime="2026-03-01 10:18:03.053052547 +0000 UTC m=+4212.294931764" watchObservedRunningTime="2026-03-01 10:18:03.056513513 +0000 UTC m=+4212.298392720" Mar 01 10:18:04 crc kubenswrapper[4792]: I0301 10:18:04.048744 4792 generic.go:334] "Generic (PLEG): container finished" podID="a36fce8b-7564-4d74-b3ad-9bfe9979cb67" containerID="dd7d1b10f4227bfce662716c9cbd0e72c4c6de3ac564b41925cd39f0160eafdf" exitCode=0 Mar 01 10:18:04 crc 
kubenswrapper[4792]: I0301 10:18:04.048804 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" event={"ID":"a36fce8b-7564-4d74-b3ad-9bfe9979cb67","Type":"ContainerDied","Data":"dd7d1b10f4227bfce662716c9cbd0e72c4c6de3ac564b41925cd39f0160eafdf"} Mar 01 10:18:05 crc kubenswrapper[4792]: I0301 10:18:05.434307 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" Mar 01 10:18:05 crc kubenswrapper[4792]: I0301 10:18:05.581423 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwkt8\" (UniqueName: \"kubernetes.io/projected/a36fce8b-7564-4d74-b3ad-9bfe9979cb67-kube-api-access-gwkt8\") pod \"a36fce8b-7564-4d74-b3ad-9bfe9979cb67\" (UID: \"a36fce8b-7564-4d74-b3ad-9bfe9979cb67\") " Mar 01 10:18:05 crc kubenswrapper[4792]: I0301 10:18:05.594481 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a36fce8b-7564-4d74-b3ad-9bfe9979cb67-kube-api-access-gwkt8" (OuterVolumeSpecName: "kube-api-access-gwkt8") pod "a36fce8b-7564-4d74-b3ad-9bfe9979cb67" (UID: "a36fce8b-7564-4d74-b3ad-9bfe9979cb67"). InnerVolumeSpecName "kube-api-access-gwkt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:18:05 crc kubenswrapper[4792]: I0301 10:18:05.684695 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwkt8\" (UniqueName: \"kubernetes.io/projected/a36fce8b-7564-4d74-b3ad-9bfe9979cb67-kube-api-access-gwkt8\") on node \"crc\" DevicePath \"\"" Mar 01 10:18:06 crc kubenswrapper[4792]: I0301 10:18:06.066001 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" event={"ID":"a36fce8b-7564-4d74-b3ad-9bfe9979cb67","Type":"ContainerDied","Data":"122486aa561c7db6b94b6a3f7bfb44fc59bc39758ef217f2d542471c2f785f9f"} Mar 01 10:18:06 crc kubenswrapper[4792]: I0301 10:18:06.066178 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="122486aa561c7db6b94b6a3f7bfb44fc59bc39758ef217f2d542471c2f785f9f" Mar 01 10:18:06 crc kubenswrapper[4792]: I0301 10:18:06.066053 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" Mar 01 10:18:06 crc kubenswrapper[4792]: I0301 10:18:06.122329 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539332-f6brk"] Mar 01 10:18:06 crc kubenswrapper[4792]: I0301 10:18:06.131448 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539332-f6brk"] Mar 01 10:18:07 crc kubenswrapper[4792]: I0301 10:18:07.421042 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9c6b5a-8834-45a7-bf9d-000bcfd068f3" path="/var/lib/kubelet/pods/5f9c6b5a-8834-45a7-bf9d-000bcfd068f3/volumes" Mar 01 10:18:08 crc kubenswrapper[4792]: I0301 10:18:08.409064 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:18:08 crc kubenswrapper[4792]: E0301 10:18:08.409629 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:18:19 crc kubenswrapper[4792]: I0301 10:18:19.409035 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:18:19 crc kubenswrapper[4792]: E0301 10:18:19.409792 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:18:26 crc kubenswrapper[4792]: I0301 10:18:26.234238 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f86869f48-jg6nw_6c97472a-b6b7-4fc4-b872-a318812f0999/barbican-api/0.log" Mar 01 10:18:26 crc kubenswrapper[4792]: I0301 10:18:26.492845 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f86869f48-jg6nw_6c97472a-b6b7-4fc4-b872-a318812f0999/barbican-api-log/0.log" Mar 01 10:18:26 crc kubenswrapper[4792]: I0301 10:18:26.535789 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64f98fd86b-96l6n_d30c642c-b4ae-495a-8acd-cc8be4a0f412/barbican-keystone-listener/0.log" Mar 01 10:18:26 crc kubenswrapper[4792]: I0301 10:18:26.581592 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64f98fd86b-96l6n_d30c642c-b4ae-495a-8acd-cc8be4a0f412/barbican-keystone-listener-log/0.log" Mar 01 
10:18:27 crc kubenswrapper[4792]: I0301 10:18:27.261870 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65f4d58895-tvn59_26fbd30a-a485-4463-9aac-bb695c43e9e3/barbican-worker-log/0.log" Mar 01 10:18:27 crc kubenswrapper[4792]: I0301 10:18:27.267907 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65f4d58895-tvn59_26fbd30a-a485-4463-9aac-bb695c43e9e3/barbican-worker/0.log" Mar 01 10:18:27 crc kubenswrapper[4792]: I0301 10:18:27.489792 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb_1201ca91-41eb-45d0-991d-71883b4014ae/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:27 crc kubenswrapper[4792]: I0301 10:18:27.529741 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/ceilometer-central-agent/0.log" Mar 01 10:18:27 crc kubenswrapper[4792]: I0301 10:18:27.589373 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/ceilometer-notification-agent/0.log" Mar 01 10:18:27 crc kubenswrapper[4792]: I0301 10:18:27.707850 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/proxy-httpd/0.log" Mar 01 10:18:27 crc kubenswrapper[4792]: I0301 10:18:27.837804 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/sg-core/0.log" Mar 01 10:18:28 crc kubenswrapper[4792]: I0301 10:18:28.089035 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj_f3a428e9-b35d-4f80-bb40-c158095d5bfa/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:28 crc kubenswrapper[4792]: I0301 10:18:28.527023 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m_2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:28 crc kubenswrapper[4792]: I0301 10:18:28.644718 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_084f9db1-15eb-458c-8b43-aeb5dbb0555f/cinder-api-log/0.log" Mar 01 10:18:28 crc kubenswrapper[4792]: I0301 10:18:28.663304 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_084f9db1-15eb-458c-8b43-aeb5dbb0555f/cinder-api/0.log" Mar 01 10:18:28 crc kubenswrapper[4792]: I0301 10:18:28.878961 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_23d15722-3d0f-44ce-ac55-eba67760f0e9/probe/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.110818 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_23d15722-3d0f-44ce-ac55-eba67760f0e9/cinder-backup/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.187064 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_688f590f-ae5c-4caf-b8c7-013a118f42c5/cinder-scheduler/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.279053 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_688f590f-ae5c-4caf-b8c7-013a118f42c5/probe/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.384399 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d3ca4743-fa6c-4e2e-b2c8-b2362f44a727/cinder-volume/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.462559 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d3ca4743-fa6c-4e2e-b2c8-b2362f44a727/probe/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.709115 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5_cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.749609 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-dtgks_f25228f4-912f-408c-a1d6-9279c350b767/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.907223 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-fqgkv_49541358-1fd0-4d1d-8b61-0c618994dfc0/init/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.125056 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-fqgkv_49541358-1fd0-4d1d-8b61-0c618994dfc0/init/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.247410 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_52b189da-3327-40c1-bf22-a842b0980593/glance-httpd/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.276184 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-fqgkv_49541358-1fd0-4d1d-8b61-0c618994dfc0/dnsmasq-dns/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.362984 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_52b189da-3327-40c1-bf22-a842b0980593/glance-log/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.554387 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9d055103-6c35-481f-820a-7aa363543404/glance-httpd/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.616969 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_9d055103-6c35-481f-820a-7aa363543404/glance-log/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.835590 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-79f8cb6d9d-xg7h5_d7f79f77-ac1b-445e-8e28-85c8964f5461/horizon/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.920163 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-79f8cb6d9d-xg7h5_d7f79f77-ac1b-445e-8e28-85c8964f5461/horizon-log/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.986508 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-xqprh_d11c64e6-0562-41d9-a213-f1c5749b4c83/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:31 crc kubenswrapper[4792]: I0301 10:18:31.162002 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4rw28_822af429-9091-43e5-a16d-7a287f2c5bb2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:31 crc kubenswrapper[4792]: I0301 10:18:31.346062 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29539321-sclgm_7ec04609-b280-4df0-a0c5-2e4c7208c1c6/keystone-cron/0.log" Mar 01 10:18:31 crc kubenswrapper[4792]: I0301 10:18:31.426784 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-749f685d77-ggsln_b60e7776-3e2a-4e08-900d-cd39a29a78bc/keystone-api/0.log" Mar 01 10:18:31 crc kubenswrapper[4792]: I0301 10:18:31.498133 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6f21d62f-3539-4d5d-aeaa-cc816a51d412/kube-state-metrics/0.log" Mar 01 10:18:31 crc kubenswrapper[4792]: I0301 10:18:31.772545 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt_c7230f65-7e9a-4455-8d25-c49393bfbafe/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:31 crc kubenswrapper[4792]: I0301 10:18:31.818265 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_e6660fdc-5636-44ec-b6c0-e0e417d72e8a/manila-api-log/0.log" Mar 01 10:18:31 crc kubenswrapper[4792]: I0301 10:18:31.865614 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_e6660fdc-5636-44ec-b6c0-e0e417d72e8a/manila-api/0.log" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.060716 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_5813cf9a-1d9e-4a74-82e1-68e994c9175a/probe/0.log" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.149037 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_5813cf9a-1d9e-4a74-82e1-68e994c9175a/manila-scheduler/0.log" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.166699 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_03462f2f-874f-496a-934b-9fa6e2c55850/manila-share/0.log" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.257082 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_03462f2f-874f-496a-934b-9fa6e2c55850/probe/0.log" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.409617 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:18:32 crc kubenswrapper[4792]: E0301 10:18:32.409813 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.598858 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6c8bdfb955-kjg92_ceced30a-39e5-413f-a498-e5d4500f1eea/neutron-api/0.log" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.688700 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6c8bdfb955-kjg92_ceced30a-39e5-413f-a498-e5d4500f1eea/neutron-httpd/0.log" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.832112 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt_f737af00-5e6f-4a95-bf94-738b72990ebd/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:33 crc kubenswrapper[4792]: I0301 10:18:33.383038 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f95aafcd-79b6-4ece-b3e1-ee9ea32a2754/nova-cell0-conductor-conductor/0.log" Mar 01 10:18:33 crc kubenswrapper[4792]: I0301 10:18:33.486292 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9c6de822-b7f5-4530-bb5b-ca879ff899fc/nova-api-log/0.log" Mar 01 10:18:33 crc kubenswrapper[4792]: I0301 10:18:33.700445 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9c6de822-b7f5-4530-bb5b-ca879ff899fc/nova-api-api/0.log" Mar 01 10:18:33 crc kubenswrapper[4792]: I0301 10:18:33.747712 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9ef6cc4e-2fd6-403b-a163-638395c30672/nova-cell1-conductor-conductor/0.log" Mar 01 10:18:33 crc kubenswrapper[4792]: I0301 10:18:33.812200 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_63afaac7-c934-4410-b2b5-ab04ad085489/nova-cell1-novncproxy-novncproxy/0.log" Mar 01 10:18:34 crc 
kubenswrapper[4792]: I0301 10:18:34.076592 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7_d7776778-c586-4ab6-8fdf-bfed4168992d/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:34 crc kubenswrapper[4792]: I0301 10:18:34.194798 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cbf9560f-212f-460a-9a4d-250e20b00d18/nova-metadata-log/0.log" Mar 01 10:18:34 crc kubenswrapper[4792]: I0301 10:18:34.834974 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_3a38c1a1-88bc-4bce-aea4-13e676aab111/nova-scheduler-scheduler/0.log" Mar 01 10:18:34 crc kubenswrapper[4792]: I0301 10:18:34.997360 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f2d03d42-7830-444b-a8ae-c91e16d352b9/mysql-bootstrap/0.log" Mar 01 10:18:35 crc kubenswrapper[4792]: I0301 10:18:35.215391 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f2d03d42-7830-444b-a8ae-c91e16d352b9/galera/0.log" Mar 01 10:18:35 crc kubenswrapper[4792]: I0301 10:18:35.247771 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f2d03d42-7830-444b-a8ae-c91e16d352b9/mysql-bootstrap/0.log" Mar 01 10:18:35 crc kubenswrapper[4792]: I0301 10:18:35.479401 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b969e6eb-14a7-4e45-8342-ccbd05c06261/mysql-bootstrap/0.log" Mar 01 10:18:35 crc kubenswrapper[4792]: I0301 10:18:35.706755 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b969e6eb-14a7-4e45-8342-ccbd05c06261/mysql-bootstrap/0.log" Mar 01 10:18:35 crc kubenswrapper[4792]: I0301 10:18:35.711389 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_b969e6eb-14a7-4e45-8342-ccbd05c06261/galera/0.log" Mar 01 10:18:35 crc kubenswrapper[4792]: I0301 10:18:35.863359 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cbf9560f-212f-460a-9a4d-250e20b00d18/nova-metadata-metadata/0.log" Mar 01 10:18:36 crc kubenswrapper[4792]: I0301 10:18:36.322749 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_fecafda6-dcf9-46ea-8678-8da499154ad7/openstackclient/0.log" Mar 01 10:18:36 crc kubenswrapper[4792]: I0301 10:18:36.427079 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7wc55_9493aff0-58e3-44ca-ba01-69f3b284d732/openstack-network-exporter/0.log" Mar 01 10:18:36 crc kubenswrapper[4792]: I0301 10:18:36.662426 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mpvqc_d50ee3b1-4f97-4644-802d-04c85d9c3abc/ovn-controller/0.log" Mar 01 10:18:36 crc kubenswrapper[4792]: I0301 10:18:36.846252 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovsdb-server-init/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.038981 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovsdb-server-init/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.044380 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovs-vswitchd/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.116227 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovsdb-server/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.381147 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_1712e112-23fd-402b-ae0b-f63a594d4fab/openstack-network-exporter/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.392629 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-bc5rl_e4b8a64b-6bea-426c-b1f5-2372342d4211/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.874766 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1712e112-23fd-402b-ae0b-f63a594d4fab/ovn-northd/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.896867 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2c9312b5-705e-42f0-8462-62c8fdeb0791/ovsdbserver-nb/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.976836 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2c9312b5-705e-42f0-8462-62c8fdeb0791/openstack-network-exporter/0.log" Mar 01 10:18:38 crc kubenswrapper[4792]: I0301 10:18:38.183381 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a20f7417-3c04-411a-88b9-d60664faaee3/openstack-network-exporter/0.log" Mar 01 10:18:38 crc kubenswrapper[4792]: I0301 10:18:38.208006 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a20f7417-3c04-411a-88b9-d60664faaee3/ovsdbserver-sb/0.log" Mar 01 10:18:38 crc kubenswrapper[4792]: I0301 10:18:38.482890 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-84f9696594-qdwsv_18f9e703-dec0-46e1-a428-580bdb68e54e/placement-api/0.log" Mar 01 10:18:38 crc kubenswrapper[4792]: I0301 10:18:38.665177 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e0e1dd7a-6a53-446d-bf90-5813f7a3fda0/setup-container/0.log" Mar 01 10:18:38 crc kubenswrapper[4792]: I0301 10:18:38.701155 4792 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_placement-84f9696594-qdwsv_18f9e703-dec0-46e1-a428-580bdb68e54e/placement-log/0.log" Mar 01 10:18:38 crc kubenswrapper[4792]: I0301 10:18:38.831358 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e0e1dd7a-6a53-446d-bf90-5813f7a3fda0/setup-container/0.log" Mar 01 10:18:38 crc kubenswrapper[4792]: I0301 10:18:38.958782 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e0e1dd7a-6a53-446d-bf90-5813f7a3fda0/rabbitmq/0.log" Mar 01 10:18:39 crc kubenswrapper[4792]: I0301 10:18:39.054895 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63658b27-63d9-4a0f-afca-3a3c245b9b9d/setup-container/0.log" Mar 01 10:18:39 crc kubenswrapper[4792]: I0301 10:18:39.212971 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63658b27-63d9-4a0f-afca-3a3c245b9b9d/setup-container/0.log" Mar 01 10:18:39 crc kubenswrapper[4792]: I0301 10:18:39.415013 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63658b27-63d9-4a0f-afca-3a3c245b9b9d/rabbitmq/0.log" Mar 01 10:18:39 crc kubenswrapper[4792]: I0301 10:18:39.443658 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d_34275228-a1ab-4955-9d16-d184643a86d1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:39 crc kubenswrapper[4792]: I0301 10:18:39.497132 4792 scope.go:117] "RemoveContainer" containerID="92df935b0380167f616deb42823e3a7a1744b9e85454d7327ad662b50fab1963" Mar 01 10:18:39 crc kubenswrapper[4792]: I0301 10:18:39.781250 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64_6c517000-6918-4f58-871b-7c4d26197ccf/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:39 crc kubenswrapper[4792]: I0301 
10:18:39.880896 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gxxr7_ff733b23-0a97-4623-9eeb-339aa02fc3b0/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:40 crc kubenswrapper[4792]: I0301 10:18:40.118344 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-8k5rj_ac58ff00-ba74-492a-97f1-e72c56686f1d/ssh-known-hosts-edpm-deployment/0.log" Mar 01 10:18:40 crc kubenswrapper[4792]: I0301 10:18:40.201232 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ee1c75ce-61f7-4ce5-a757-b7405d7135bd/tempest-tests-tempest-tests-runner/0.log" Mar 01 10:18:40 crc kubenswrapper[4792]: I0301 10:18:40.514626 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_478d8531-4e8e-4775-999d-42af4afef106/test-operator-logs-container/0.log" Mar 01 10:18:40 crc kubenswrapper[4792]: I0301 10:18:40.660033 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-phn2l_59b987d8-9463-48cb-9651-1e5cb16aa764/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:47 crc kubenswrapper[4792]: I0301 10:18:47.412945 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:18:48 crc kubenswrapper[4792]: I0301 10:18:48.414316 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"34d9173341b46ccf37e4f77b26afd17d6e6d0f7b2699af960d32ad54ab5e3db7"} Mar 01 10:18:52 crc kubenswrapper[4792]: I0301 10:18:52.827559 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_84d455ad-7bbb-4771-a8ed-9aa1984e1d40/memcached/0.log" Mar 01 
10:19:11 crc kubenswrapper[4792]: I0301 10:19:11.774099 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/util/0.log" Mar 01 10:19:11 crc kubenswrapper[4792]: I0301 10:19:11.978148 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/util/0.log" Mar 01 10:19:12 crc kubenswrapper[4792]: I0301 10:19:12.034134 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/pull/0.log" Mar 01 10:19:12 crc kubenswrapper[4792]: I0301 10:19:12.063025 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/pull/0.log" Mar 01 10:19:12 crc kubenswrapper[4792]: I0301 10:19:12.234343 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/pull/0.log" Mar 01 10:19:12 crc kubenswrapper[4792]: I0301 10:19:12.284273 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/util/0.log" Mar 01 10:19:12 crc kubenswrapper[4792]: I0301 10:19:12.306449 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/extract/0.log" Mar 01 10:19:12 crc kubenswrapper[4792]: I0301 10:19:12.790687 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-72srw_bf1f37ea-a566-4dfd-b45b-02f284f19ce3/manager/0.log" Mar 01 10:19:13 crc kubenswrapper[4792]: I0301 10:19:13.425980 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-9wzbh_02dd5cc0-c44b-4ede-972b-9d26c9c54100/manager/0.log" Mar 01 10:19:13 crc kubenswrapper[4792]: I0301 10:19:13.649749 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-7v65r_5044cf86-f557-41d4-b6c0-a41a668ac999/manager/0.log" Mar 01 10:19:13 crc kubenswrapper[4792]: I0301 10:19:13.931099 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-55qzx_cd83ed19-023d-43c2-92db-d290499db3d4/manager/0.log" Mar 01 10:19:14 crc kubenswrapper[4792]: I0301 10:19:14.552388 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-jvw5j_2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5/manager/0.log" Mar 01 10:19:14 crc kubenswrapper[4792]: I0301 10:19:14.654998 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-jlnsb_8741a141-0194-4eb2-956e-c41f4ffe1338/manager/0.log" Mar 01 10:19:14 crc kubenswrapper[4792]: I0301 10:19:14.735588 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-dsqtf_ea6739c2-185a-43e7-8fcf-0b2ae31957a0/manager/0.log" Mar 01 10:19:15 crc kubenswrapper[4792]: I0301 10:19:15.100673 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-t5fsn_376afe52-646d-44b7-b32e-ce6cd6dc21a6/manager/0.log" Mar 01 10:19:15 crc kubenswrapper[4792]: I0301 10:19:15.144592 4792 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-wjf62_234d2ae5-7589-44cc-83f4-b0ee8a91940a/manager/0.log" Mar 01 10:19:15 crc kubenswrapper[4792]: I0301 10:19:15.506870 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-hlzm6_1793465e-1273-4250-a238-c99798788618/manager/0.log" Mar 01 10:19:15 crc kubenswrapper[4792]: I0301 10:19:15.676140 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-qjqd2_dfb10d33-c4f1-4287-be83-dff835c733ba/manager/0.log" Mar 01 10:19:15 crc kubenswrapper[4792]: I0301 10:19:15.924476 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-54rpl_ecc17c18-7695-4d22-9a95-bcac51800d60/manager/0.log" Mar 01 10:19:15 crc kubenswrapper[4792]: I0301 10:19:15.970363 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-knk7m_8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9/manager/0.log" Mar 01 10:19:16 crc kubenswrapper[4792]: I0301 10:19:16.160805 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7b4cc4776948grv_9244686e-175e-45f9-9eb7-23621cd1f3cd/manager/0.log" Mar 01 10:19:16 crc kubenswrapper[4792]: I0301 10:19:16.512480 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-595c94944c-vtchh_c967e6f5-6388-4ae5-9ccf-379b6305e1b0/operator/0.log" Mar 01 10:19:16 crc kubenswrapper[4792]: I0301 10:19:16.687033 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-5kfk4_dc22117a-72a7-4838-bb1c-111e91514b98/registry-server/0.log" Mar 01 10:19:16 crc kubenswrapper[4792]: I0301 10:19:16.925931 4792 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-zkx7c_3d38195c-e4ff-49cf-9592-e9f52d73f2df/manager/0.log" Mar 01 10:19:17 crc kubenswrapper[4792]: I0301 10:19:17.006978 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-jdn6k_808b8753-0a20-419b-8b04-dcbccaa2d77e/manager/0.log" Mar 01 10:19:17 crc kubenswrapper[4792]: I0301 10:19:17.264892 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-r5l9m_1ecd6b07-eda9-41d6-90af-6471699ff808/operator/0.log" Mar 01 10:19:17 crc kubenswrapper[4792]: I0301 10:19:17.468296 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-mqndr_e0cef8e2-a392-4612-97c6-17c611b2a44e/manager/0.log" Mar 01 10:19:17 crc kubenswrapper[4792]: I0301 10:19:17.738403 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-bcnns_2970c60c-7b03-4667-99e4-08c094cdbfc2/manager/0.log" Mar 01 10:19:17 crc kubenswrapper[4792]: I0301 10:19:17.764101 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-jpxwz_4fe8270e-a46d-40bc-8d24-a4585b196f5e/manager/0.log" Mar 01 10:19:18 crc kubenswrapper[4792]: I0301 10:19:18.117660 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-64lkf_e45ebab9-87d5-4b2f-b3d1-f1832864584d/manager/0.log" Mar 01 10:19:18 crc kubenswrapper[4792]: I0301 10:19:18.329718 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-864b865b94-5ndlx_d1d3783f-78e9-461a-916a-5a46e3083e70/manager/0.log" Mar 01 10:19:22 crc kubenswrapper[4792]: I0301 10:19:22.143987 4792 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-ggspg_b9e3fd6b-e3e2-4380-b8d7-900891df562a/manager/0.log"
Mar 01 10:19:41 crc kubenswrapper[4792]: I0301 10:19:41.543032 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-9smfd_e0b63d94-59de-45da-8058-89714bea7a90/control-plane-machine-set-operator/0.log"
Mar 01 10:19:41 crc kubenswrapper[4792]: I0301 10:19:41.725515 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nv4bp_51683a24-edad-4808-b2ec-6a628bfdd937/kube-rbac-proxy/0.log"
Mar 01 10:19:41 crc kubenswrapper[4792]: I0301 10:19:41.806547 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nv4bp_51683a24-edad-4808-b2ec-6a628bfdd937/machine-api-operator/0.log"
Mar 01 10:19:57 crc kubenswrapper[4792]: I0301 10:19:57.645398 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-tm5s6_2071887a-31a9-428d-92d0-bf8a361011ca/cert-manager-cainjector/0.log"
Mar 01 10:19:57 crc kubenswrapper[4792]: I0301 10:19:57.674241 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-4qgsm_bf71ada0-c7b2-4255-bb2c-31ec3309a29d/cert-manager-controller/0.log"
Mar 01 10:19:57 crc kubenswrapper[4792]: I0301 10:19:57.930124 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-rckpb_a03eedd4-ecde-4905-95a7-c43b45ef9da9/cert-manager-webhook/0.log"
Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.147171 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539340-rjnp9"]
Mar 01 10:20:00 crc kubenswrapper[4792]: E0301 10:20:00.148087 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36fce8b-7564-4d74-b3ad-9bfe9979cb67" containerName="oc"
Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.148100 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36fce8b-7564-4d74-b3ad-9bfe9979cb67" containerName="oc"
Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.148293 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a36fce8b-7564-4d74-b3ad-9bfe9979cb67" containerName="oc"
Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.148930 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539340-rjnp9"
Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.153441 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz"
Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.153454 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.153449 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.159824 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539340-rjnp9"]
Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.325787 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wjxw\" (UniqueName: \"kubernetes.io/projected/5d79ac35-053d-480b-a8ef-3b03122b0152-kube-api-access-8wjxw\") pod \"auto-csr-approver-29539340-rjnp9\" (UID: \"5d79ac35-053d-480b-a8ef-3b03122b0152\") " pod="openshift-infra/auto-csr-approver-29539340-rjnp9"
Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.428449 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wjxw\" (UniqueName: \"kubernetes.io/projected/5d79ac35-053d-480b-a8ef-3b03122b0152-kube-api-access-8wjxw\") pod \"auto-csr-approver-29539340-rjnp9\" (UID: \"5d79ac35-053d-480b-a8ef-3b03122b0152\") " pod="openshift-infra/auto-csr-approver-29539340-rjnp9"
Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.451724 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wjxw\" (UniqueName: \"kubernetes.io/projected/5d79ac35-053d-480b-a8ef-3b03122b0152-kube-api-access-8wjxw\") pod \"auto-csr-approver-29539340-rjnp9\" (UID: \"5d79ac35-053d-480b-a8ef-3b03122b0152\") " pod="openshift-infra/auto-csr-approver-29539340-rjnp9"
Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.472629 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539340-rjnp9"
Mar 01 10:20:01 crc kubenswrapper[4792]: I0301 10:20:01.066040 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 01 10:20:01 crc kubenswrapper[4792]: I0301 10:20:01.075157 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539340-rjnp9"]
Mar 01 10:20:02 crc kubenswrapper[4792]: I0301 10:20:02.042355 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539340-rjnp9" event={"ID":"5d79ac35-053d-480b-a8ef-3b03122b0152","Type":"ContainerStarted","Data":"993ecebc9f66b56082a2c6aad384afbd8b43e4f69e9684b2bc92daa3f1248dd6"}
Mar 01 10:20:03 crc kubenswrapper[4792]: I0301 10:20:03.050755 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539340-rjnp9" event={"ID":"5d79ac35-053d-480b-a8ef-3b03122b0152","Type":"ContainerStarted","Data":"fd47f4e8c316ceef6350038b81becae73fa982c181d0ebd8621bb407bf6fb4b7"}
Mar 01 10:20:03 crc kubenswrapper[4792]: I0301 10:20:03.066639 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539340-rjnp9" podStartSLOduration=1.995481039 podStartE2EDuration="3.066623723s" podCreationTimestamp="2026-03-01 10:20:00 +0000 UTC" firstStartedPulling="2026-03-01 10:20:01.065730643 +0000 UTC m=+4330.307609850" lastFinishedPulling="2026-03-01 10:20:02.136873337 +0000 UTC m=+4331.378752534" observedRunningTime="2026-03-01 10:20:03.065085625 +0000 UTC m=+4332.306964832" watchObservedRunningTime="2026-03-01 10:20:03.066623723 +0000 UTC m=+4332.308502920"
Mar 01 10:20:04 crc kubenswrapper[4792]: I0301 10:20:04.060163 4792 generic.go:334] "Generic (PLEG): container finished" podID="5d79ac35-053d-480b-a8ef-3b03122b0152" containerID="fd47f4e8c316ceef6350038b81becae73fa982c181d0ebd8621bb407bf6fb4b7" exitCode=0
Mar 01 10:20:04 crc kubenswrapper[4792]: I0301 10:20:04.060216 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539340-rjnp9" event={"ID":"5d79ac35-053d-480b-a8ef-3b03122b0152","Type":"ContainerDied","Data":"fd47f4e8c316ceef6350038b81becae73fa982c181d0ebd8621bb407bf6fb4b7"}
Mar 01 10:20:05 crc kubenswrapper[4792]: I0301 10:20:05.402669 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539340-rjnp9"
Mar 01 10:20:05 crc kubenswrapper[4792]: I0301 10:20:05.537438 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wjxw\" (UniqueName: \"kubernetes.io/projected/5d79ac35-053d-480b-a8ef-3b03122b0152-kube-api-access-8wjxw\") pod \"5d79ac35-053d-480b-a8ef-3b03122b0152\" (UID: \"5d79ac35-053d-480b-a8ef-3b03122b0152\") "
Mar 01 10:20:05 crc kubenswrapper[4792]: I0301 10:20:05.543865 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d79ac35-053d-480b-a8ef-3b03122b0152-kube-api-access-8wjxw" (OuterVolumeSpecName: "kube-api-access-8wjxw") pod "5d79ac35-053d-480b-a8ef-3b03122b0152" (UID: "5d79ac35-053d-480b-a8ef-3b03122b0152"). InnerVolumeSpecName "kube-api-access-8wjxw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:20:05 crc kubenswrapper[4792]: I0301 10:20:05.639486 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wjxw\" (UniqueName: \"kubernetes.io/projected/5d79ac35-053d-480b-a8ef-3b03122b0152-kube-api-access-8wjxw\") on node \"crc\" DevicePath \"\""
Mar 01 10:20:06 crc kubenswrapper[4792]: I0301 10:20:06.079861 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539340-rjnp9" event={"ID":"5d79ac35-053d-480b-a8ef-3b03122b0152","Type":"ContainerDied","Data":"993ecebc9f66b56082a2c6aad384afbd8b43e4f69e9684b2bc92daa3f1248dd6"}
Mar 01 10:20:06 crc kubenswrapper[4792]: I0301 10:20:06.079926 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="993ecebc9f66b56082a2c6aad384afbd8b43e4f69e9684b2bc92daa3f1248dd6"
Mar 01 10:20:06 crc kubenswrapper[4792]: I0301 10:20:06.079931 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539340-rjnp9"
Mar 01 10:20:06 crc kubenswrapper[4792]: I0301 10:20:06.141964 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539334-c4cd2"]
Mar 01 10:20:06 crc kubenswrapper[4792]: I0301 10:20:06.150460 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539334-c4cd2"]
Mar 01 10:20:07 crc kubenswrapper[4792]: I0301 10:20:07.422197 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="807c2da3-ef0e-4e89-9457-37401354a8e9" path="/var/lib/kubelet/pods/807c2da3-ef0e-4e89-9457-37401354a8e9/volumes"
Mar 01 10:20:14 crc kubenswrapper[4792]: I0301 10:20:14.290216 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-mtxkm_f7ca92c8-f38b-4a0a-b330-5809993cbb49/nmstate-console-plugin/0.log"
Mar 01 10:20:14 crc kubenswrapper[4792]: I0301 10:20:14.404339 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-9j2tz_7105919f-ddac-45db-a8f7-bd927e5737df/nmstate-handler/0.log"
Mar 01 10:20:14 crc kubenswrapper[4792]: I0301 10:20:14.664895 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-97cv9_bfe2cc56-28ca-4201-ba5a-4208dd1ec818/kube-rbac-proxy/0.log"
Mar 01 10:20:14 crc kubenswrapper[4792]: I0301 10:20:14.867389 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-97cv9_bfe2cc56-28ca-4201-ba5a-4208dd1ec818/nmstate-metrics/0.log"
Mar 01 10:20:15 crc kubenswrapper[4792]: I0301 10:20:15.014212 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-chfpw_fb942d1c-2a1a-4265-ae29-02f185d4cc40/nmstate-operator/0.log"
Mar 01 10:20:15 crc kubenswrapper[4792]: I0301 10:20:15.054129 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-zwhpc_aa2300d6-10c0-4dc9-812a-fcb30f09920e/nmstate-webhook/0.log"
Mar 01 10:20:39 crc kubenswrapper[4792]: I0301 10:20:39.685928 4792 scope.go:117] "RemoveContainer" containerID="b30f740be1f0c17125e3d3e62c51a8c03429e4e9dd6640d5fa13fe74408f6823"
Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.158994 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-twxml_f73a6813-31ea-4018-bd23-45bf2f1dfe89/kube-rbac-proxy/0.log"
Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.287631 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-twxml_f73a6813-31ea-4018-bd23-45bf2f1dfe89/controller/0.log"
Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.481750 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log"
Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.663608 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log"
Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.679855 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log"
Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.686360 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log"
Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.732892 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log"
Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.972231 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log"
Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.995984 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log"
Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.998268 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log"
Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.061427 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log"
Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.181101 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log"
Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.186357 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log"
Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.212327 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log"
Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.283113 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/controller/0.log"
Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.425652 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/frr-metrics/0.log"
Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.477697 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/kube-rbac-proxy/0.log"
Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.543041 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/kube-rbac-proxy-frr/0.log"
Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.709524 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/reloader/0.log"
Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.819253 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-kfnzk_d2f0572c-e661-495c-873c-6e2d18f2ab7d/frr-k8s-webhook-server/0.log"
Mar 01 10:20:46 crc kubenswrapper[4792]: I0301 10:20:46.090329 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5cd84fcfbc-lrpmz_ba22e25a-31e8-4ca7-b169-f7433eda818b/manager/0.log"
Mar 01 10:20:46 crc kubenswrapper[4792]: I0301 10:20:46.308552 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-776c7d78bd-jwfh6_cf86866e-8afa-44da-a688-e1c018a025bd/webhook-server/0.log"
Mar 01 10:20:46 crc kubenswrapper[4792]: I0301 10:20:46.322564 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zpr27_8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7/kube-rbac-proxy/0.log"
Mar 01 10:20:47 crc kubenswrapper[4792]: I0301 10:20:47.033033 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zpr27_8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7/speaker/0.log"
Mar 01 10:20:47 crc kubenswrapper[4792]: I0301 10:20:47.063113 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/frr/0.log"
Mar 01 10:21:01 crc kubenswrapper[4792]: I0301 10:21:01.914890 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/util/0.log"
Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.133283 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/pull/0.log"
Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.180310 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/pull/0.log"
Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.180391 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/util/0.log"
Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.331018 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/util/0.log"
Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.347491 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/pull/0.log"
Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.356154 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/extract/0.log"
Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.599684 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-utilities/0.log"
Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.711537 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-utilities/0.log"
Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.738983 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-content/0.log"
Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.761365 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-content/0.log"
Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.933293 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-content/0.log"
Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.940275 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-utilities/0.log"
Mar 01 10:21:03 crc kubenswrapper[4792]: I0301 10:21:03.132081 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-utilities/0.log"
Mar 01 10:21:03 crc kubenswrapper[4792]: I0301 10:21:03.389050 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/registry-server/0.log"
Mar 01 10:21:03 crc kubenswrapper[4792]: I0301 10:21:03.405112 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-content/0.log"
Mar 01 10:21:03 crc kubenswrapper[4792]: I0301 10:21:03.427325 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-content/0.log"
Mar 01 10:21:03 crc kubenswrapper[4792]: I0301 10:21:03.469015 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-utilities/0.log"
Mar 01 10:21:03 crc kubenswrapper[4792]: I0301 10:21:03.685201 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-utilities/0.log"
Mar 01 10:21:03 crc kubenswrapper[4792]: I0301 10:21:03.714729 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-content/0.log"
Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.089516 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/util/0.log"
Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.206420 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/pull/0.log"
Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.367002 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/util/0.log"
Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.390128 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/pull/0.log"
Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.479782 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/registry-server/0.log"
Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.711818 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/util/0.log"
Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.712338 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/pull/0.log"
Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.716816 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/extract/0.log"
Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.943307 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.943361 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.013935 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gfkbs_46fe59e7-8122-4621-ae8d-237a91daee5e/marketplace-operator/0.log"
Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.033475 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-utilities/0.log"
Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.251791 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-utilities/0.log"
Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.297588 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-content/0.log"
Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.371467 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-content/0.log"
Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.488325 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-content/0.log"
Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.566175 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-utilities/0.log"
Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.672106 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/registry-server/0.log"
Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.716410 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-utilities/0.log"
Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.898783 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-content/0.log"
Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.907567 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-utilities/0.log"
Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.949239 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-content/0.log"
Mar 01 10:21:06 crc kubenswrapper[4792]: I0301 10:21:06.089452 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-utilities/0.log"
Mar 01 10:21:06 crc kubenswrapper[4792]: I0301 10:21:06.109349 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-content/0.log"
Mar 01 10:21:06 crc kubenswrapper[4792]: I0301 10:21:06.621486 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/registry-server/0.log"
Mar 01 10:21:34 crc kubenswrapper[4792]: I0301 10:21:34.943609 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 10:21:34 crc kubenswrapper[4792]: I0301 10:21:34.944118 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.140619 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539342-5t24p"]
Mar 01 10:22:00 crc kubenswrapper[4792]: E0301 10:22:00.141602 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d79ac35-053d-480b-a8ef-3b03122b0152" containerName="oc"
Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.141619 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d79ac35-053d-480b-a8ef-3b03122b0152" containerName="oc"
Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.141815 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d79ac35-053d-480b-a8ef-3b03122b0152" containerName="oc"
Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.142515 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539342-5t24p"
Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.144549 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.144986 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.149529 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz"
Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.160640 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539342-5t24p"]
Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.322194 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tjsj\" (UniqueName: \"kubernetes.io/projected/821a550c-e6ff-4517-a306-11ea497be759-kube-api-access-2tjsj\") pod \"auto-csr-approver-29539342-5t24p\" (UID: \"821a550c-e6ff-4517-a306-11ea497be759\") " pod="openshift-infra/auto-csr-approver-29539342-5t24p"
Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.424060 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tjsj\" (UniqueName: \"kubernetes.io/projected/821a550c-e6ff-4517-a306-11ea497be759-kube-api-access-2tjsj\") pod \"auto-csr-approver-29539342-5t24p\" (UID: \"821a550c-e6ff-4517-a306-11ea497be759\") " pod="openshift-infra/auto-csr-approver-29539342-5t24p"
Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.449299 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tjsj\" (UniqueName: \"kubernetes.io/projected/821a550c-e6ff-4517-a306-11ea497be759-kube-api-access-2tjsj\") pod \"auto-csr-approver-29539342-5t24p\" (UID: \"821a550c-e6ff-4517-a306-11ea497be759\") " pod="openshift-infra/auto-csr-approver-29539342-5t24p"
Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.480245 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539342-5t24p"
Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.965667 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539342-5t24p"]
Mar 01 10:22:01 crc kubenswrapper[4792]: I0301 10:22:01.209436 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539342-5t24p" event={"ID":"821a550c-e6ff-4517-a306-11ea497be759","Type":"ContainerStarted","Data":"f117d9ce2e607aa3abf587259dd00adb7c06fd9684234b0efa083d157e7c032e"}
Mar 01 10:22:03 crc kubenswrapper[4792]: I0301 10:22:03.237026 4792 generic.go:334] "Generic (PLEG): container finished" podID="821a550c-e6ff-4517-a306-11ea497be759" containerID="ec9227dfee7817fbe5632f2b037665aa0557b7bf6988989846a2502b4704a463" exitCode=0
Mar 01 10:22:03 crc kubenswrapper[4792]: I0301 10:22:03.237286 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539342-5t24p" event={"ID":"821a550c-e6ff-4517-a306-11ea497be759","Type":"ContainerDied","Data":"ec9227dfee7817fbe5632f2b037665aa0557b7bf6988989846a2502b4704a463"}
Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.866345 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539342-5t24p"
Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.923556 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tjsj\" (UniqueName: \"kubernetes.io/projected/821a550c-e6ff-4517-a306-11ea497be759-kube-api-access-2tjsj\") pod \"821a550c-e6ff-4517-a306-11ea497be759\" (UID: \"821a550c-e6ff-4517-a306-11ea497be759\") "
Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.956140 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/821a550c-e6ff-4517-a306-11ea497be759-kube-api-access-2tjsj" (OuterVolumeSpecName: "kube-api-access-2tjsj") pod "821a550c-e6ff-4517-a306-11ea497be759" (UID: "821a550c-e6ff-4517-a306-11ea497be759"). InnerVolumeSpecName "kube-api-access-2tjsj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.956459 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.956495 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.956533 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv"
Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.957235 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"34d9173341b46ccf37e4f77b26afd17d6e6d0f7b2699af960d32ad54ab5e3db7"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.957294 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://34d9173341b46ccf37e4f77b26afd17d6e6d0f7b2699af960d32ad54ab5e3db7" gracePeriod=600
Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.029232 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tjsj\" (UniqueName: \"kubernetes.io/projected/821a550c-e6ff-4517-a306-11ea497be759-kube-api-access-2tjsj\") on node \"crc\" DevicePath \"\""
Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.256562 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="34d9173341b46ccf37e4f77b26afd17d6e6d0f7b2699af960d32ad54ab5e3db7" exitCode=0
Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.256638 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"34d9173341b46ccf37e4f77b26afd17d6e6d0f7b2699af960d32ad54ab5e3db7"}
Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.256968 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"
Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.259447 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539342-5t24p" event={"ID":"821a550c-e6ff-4517-a306-11ea497be759","Type":"ContainerDied","Data":"f117d9ce2e607aa3abf587259dd00adb7c06fd9684234b0efa083d157e7c032e"}
Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.259478 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f117d9ce2e607aa3abf587259dd00adb7c06fd9684234b0efa083d157e7c032e"
Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.259530 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539342-5t24p"
Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.948167 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539336-vnhz7"]
Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.956693 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539336-vnhz7"]
Mar 01 10:22:06 crc kubenswrapper[4792]: I0301 10:22:06.270180 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43"}
Mar 01 10:22:07 crc kubenswrapper[4792]: I0301 10:22:07.420427 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="731500c7-53e0-431a-9ec7-7e56ef9c11ee" path="/var/lib/kubelet/pods/731500c7-53e0-431a-9ec7-7e56ef9c11ee/volumes"
Mar 01 10:22:39 crc kubenswrapper[4792]: I0301 10:22:39.840763 4792 scope.go:117] "RemoveContainer" containerID="457400b5efe76b08cc35b2db4e3b43b3bb9dd95107e6fe2815b55395cf597f33"
Mar 01 10:22:39 crc kubenswrapper[4792]: I0301 10:22:39.925039 4792 scope.go:117] "RemoveContainer" containerID="12a4ef8d3fa3b5f92a7e689e1d88b84e48e9aed058fb8fe4d0d9b00831ceeea9"
Mar 01 10:23:32 crc kubenswrapper[4792]: I0301 10:23:32.020965 4792 generic.go:334] "Generic (PLEG): container finished" podID="c72d6020-9460-4198-863a-ec32bc90fee9" containerID="3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c" exitCode=0
Mar 01 10:23:32 crc kubenswrapper[4792]: I0301 10:23:32.021027 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/must-gather-5ntfl" event={"ID":"c72d6020-9460-4198-863a-ec32bc90fee9","Type":"ContainerDied","Data":"3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c"}
Mar 01 10:23:32 crc kubenswrapper[4792]: I0301 10:23:32.022214 4792 scope.go:117] "RemoveContainer" containerID="3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c"
Mar 01 10:23:32 crc kubenswrapper[4792]: I0301 10:23:32.908411 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8szkd_must-gather-5ntfl_c72d6020-9460-4198-863a-ec32bc90fee9/gather/0.log"
Mar 01 10:23:41 crc kubenswrapper[4792]: I0301 10:23:41.691823 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8szkd/must-gather-5ntfl"]
Mar 01 10:23:41 crc kubenswrapper[4792]: I0301 10:23:41.696541 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8szkd/must-gather-5ntfl" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" containerName="copy" containerID="cri-o://882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a" gracePeriod=2
Mar 01 10:23:41 crc kubenswrapper[4792]: I0301 10:23:41.725258 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8szkd/must-gather-5ntfl"]
Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.101122 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8szkd_must-gather-5ntfl_c72d6020-9460-4198-863a-ec32bc90fee9/copy/0.log"
Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.101860 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.114997 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8szkd_must-gather-5ntfl_c72d6020-9460-4198-863a-ec32bc90fee9/copy/0.log" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.115642 4792 generic.go:334] "Generic (PLEG): container finished" podID="c72d6020-9460-4198-863a-ec32bc90fee9" containerID="882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a" exitCode=143 Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.115698 4792 scope.go:117] "RemoveContainer" containerID="882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.115724 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.134543 4792 scope.go:117] "RemoveContainer" containerID="3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.187672 4792 scope.go:117] "RemoveContainer" containerID="882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a" Mar 01 10:23:42 crc kubenswrapper[4792]: E0301 10:23:42.188380 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a\": container with ID starting with 882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a not found: ID does not exist" containerID="882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.188429 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a"} err="failed 
to get container status \"882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a\": rpc error: code = NotFound desc = could not find container \"882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a\": container with ID starting with 882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a not found: ID does not exist" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.188465 4792 scope.go:117] "RemoveContainer" containerID="3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c" Mar 01 10:23:42 crc kubenswrapper[4792]: E0301 10:23:42.189057 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c\": container with ID starting with 3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c not found: ID does not exist" containerID="3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.189179 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c"} err="failed to get container status \"3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c\": rpc error: code = NotFound desc = could not find container \"3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c\": container with ID starting with 3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c not found: ID does not exist" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.257405 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58lp7\" (UniqueName: \"kubernetes.io/projected/c72d6020-9460-4198-863a-ec32bc90fee9-kube-api-access-58lp7\") pod \"c72d6020-9460-4198-863a-ec32bc90fee9\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " Mar 01 10:23:42 crc kubenswrapper[4792]: 
I0301 10:23:42.257549 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c72d6020-9460-4198-863a-ec32bc90fee9-must-gather-output\") pod \"c72d6020-9460-4198-863a-ec32bc90fee9\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.262846 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72d6020-9460-4198-863a-ec32bc90fee9-kube-api-access-58lp7" (OuterVolumeSpecName: "kube-api-access-58lp7") pod "c72d6020-9460-4198-863a-ec32bc90fee9" (UID: "c72d6020-9460-4198-863a-ec32bc90fee9"). InnerVolumeSpecName "kube-api-access-58lp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.359672 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58lp7\" (UniqueName: \"kubernetes.io/projected/c72d6020-9460-4198-863a-ec32bc90fee9-kube-api-access-58lp7\") on node \"crc\" DevicePath \"\"" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.433223 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c72d6020-9460-4198-863a-ec32bc90fee9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c72d6020-9460-4198-863a-ec32bc90fee9" (UID: "c72d6020-9460-4198-863a-ec32bc90fee9"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.462184 4792 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c72d6020-9460-4198-863a-ec32bc90fee9-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 01 10:23:43 crc kubenswrapper[4792]: I0301 10:23:43.422933 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" path="/var/lib/kubelet/pods/c72d6020-9460-4198-863a-ec32bc90fee9/volumes" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.141826 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539344-r7v49"] Mar 01 10:24:00 crc kubenswrapper[4792]: E0301 10:24:00.142863 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821a550c-e6ff-4517-a306-11ea497be759" containerName="oc" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.142879 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="821a550c-e6ff-4517-a306-11ea497be759" containerName="oc" Mar 01 10:24:00 crc kubenswrapper[4792]: E0301 10:24:00.142898 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" containerName="copy" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.142922 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" containerName="copy" Mar 01 10:24:00 crc kubenswrapper[4792]: E0301 10:24:00.142937 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" containerName="gather" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.142944 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" containerName="gather" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.143209 4792 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" containerName="gather" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.143227 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="821a550c-e6ff-4517-a306-11ea497be759" containerName="oc" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.143242 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" containerName="copy" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.144059 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539344-r7v49" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.148232 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.148398 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.150168 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.175267 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539344-r7v49"] Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.296939 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xps49\" (UniqueName: \"kubernetes.io/projected/6c5eb940-780b-4b4d-ab60-e1ad0c284811-kube-api-access-xps49\") pod \"auto-csr-approver-29539344-r7v49\" (UID: \"6c5eb940-780b-4b4d-ab60-e1ad0c284811\") " pod="openshift-infra/auto-csr-approver-29539344-r7v49" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.398806 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xps49\" (UniqueName: 
\"kubernetes.io/projected/6c5eb940-780b-4b4d-ab60-e1ad0c284811-kube-api-access-xps49\") pod \"auto-csr-approver-29539344-r7v49\" (UID: \"6c5eb940-780b-4b4d-ab60-e1ad0c284811\") " pod="openshift-infra/auto-csr-approver-29539344-r7v49" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.428228 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xps49\" (UniqueName: \"kubernetes.io/projected/6c5eb940-780b-4b4d-ab60-e1ad0c284811-kube-api-access-xps49\") pod \"auto-csr-approver-29539344-r7v49\" (UID: \"6c5eb940-780b-4b4d-ab60-e1ad0c284811\") " pod="openshift-infra/auto-csr-approver-29539344-r7v49" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.465088 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539344-r7v49" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.903593 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539344-r7v49"] Mar 01 10:24:01 crc kubenswrapper[4792]: I0301 10:24:01.293605 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539344-r7v49" event={"ID":"6c5eb940-780b-4b4d-ab60-e1ad0c284811","Type":"ContainerStarted","Data":"e1a4c4d9ec6ade48f56616e194ac4074ba88b48e6166927718481562f18e30d6"} Mar 01 10:24:02 crc kubenswrapper[4792]: I0301 10:24:02.301823 4792 generic.go:334] "Generic (PLEG): container finished" podID="6c5eb940-780b-4b4d-ab60-e1ad0c284811" containerID="1f71a688006db007bbde2ad2f2afb131f6dc80a772403871242301338cd9bc3e" exitCode=0 Mar 01 10:24:02 crc kubenswrapper[4792]: I0301 10:24:02.301871 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539344-r7v49" event={"ID":"6c5eb940-780b-4b4d-ab60-e1ad0c284811","Type":"ContainerDied","Data":"1f71a688006db007bbde2ad2f2afb131f6dc80a772403871242301338cd9bc3e"} Mar 01 10:24:03 crc kubenswrapper[4792]: I0301 10:24:03.658263 4792 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539344-r7v49" Mar 01 10:24:03 crc kubenswrapper[4792]: I0301 10:24:03.773684 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xps49\" (UniqueName: \"kubernetes.io/projected/6c5eb940-780b-4b4d-ab60-e1ad0c284811-kube-api-access-xps49\") pod \"6c5eb940-780b-4b4d-ab60-e1ad0c284811\" (UID: \"6c5eb940-780b-4b4d-ab60-e1ad0c284811\") " Mar 01 10:24:03 crc kubenswrapper[4792]: I0301 10:24:03.780666 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c5eb940-780b-4b4d-ab60-e1ad0c284811-kube-api-access-xps49" (OuterVolumeSpecName: "kube-api-access-xps49") pod "6c5eb940-780b-4b4d-ab60-e1ad0c284811" (UID: "6c5eb940-780b-4b4d-ab60-e1ad0c284811"). InnerVolumeSpecName "kube-api-access-xps49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:24:03 crc kubenswrapper[4792]: I0301 10:24:03.876586 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xps49\" (UniqueName: \"kubernetes.io/projected/6c5eb940-780b-4b4d-ab60-e1ad0c284811-kube-api-access-xps49\") on node \"crc\" DevicePath \"\"" Mar 01 10:24:04 crc kubenswrapper[4792]: I0301 10:24:04.322769 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539344-r7v49" event={"ID":"6c5eb940-780b-4b4d-ab60-e1ad0c284811","Type":"ContainerDied","Data":"e1a4c4d9ec6ade48f56616e194ac4074ba88b48e6166927718481562f18e30d6"} Mar 01 10:24:04 crc kubenswrapper[4792]: I0301 10:24:04.322819 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539344-r7v49" Mar 01 10:24:04 crc kubenswrapper[4792]: I0301 10:24:04.322830 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1a4c4d9ec6ade48f56616e194ac4074ba88b48e6166927718481562f18e30d6" Mar 01 10:24:04 crc kubenswrapper[4792]: I0301 10:24:04.731324 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539338-kcgfx"] Mar 01 10:24:04 crc kubenswrapper[4792]: I0301 10:24:04.739356 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539338-kcgfx"] Mar 01 10:24:05 crc kubenswrapper[4792]: I0301 10:24:05.422428 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a36fce8b-7564-4d74-b3ad-9bfe9979cb67" path="/var/lib/kubelet/pods/a36fce8b-7564-4d74-b3ad-9bfe9979cb67/volumes" Mar 01 10:24:34 crc kubenswrapper[4792]: I0301 10:24:34.943215 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:24:34 crc kubenswrapper[4792]: I0301 10:24:34.943778 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:24:40 crc kubenswrapper[4792]: I0301 10:24:40.030564 4792 scope.go:117] "RemoveContainer" containerID="dd7d1b10f4227bfce662716c9cbd0e72c4c6de3ac564b41925cd39f0160eafdf" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.475807 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ws9zl"] Mar 01 
10:24:47 crc kubenswrapper[4792]: E0301 10:24:47.478002 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5eb940-780b-4b4d-ab60-e1ad0c284811" containerName="oc" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.478159 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5eb940-780b-4b4d-ab60-e1ad0c284811" containerName="oc" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.478427 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5eb940-780b-4b4d-ab60-e1ad0c284811" containerName="oc" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.479947 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.511182 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ws9zl"] Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.548063 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55npp\" (UniqueName: \"kubernetes.io/projected/556f4724-e050-4a4f-b7a3-680d9d7f08c5-kube-api-access-55npp\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.548156 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-utilities\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.548297 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-catalog-content\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.649743 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-catalog-content\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.649880 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55npp\" (UniqueName: \"kubernetes.io/projected/556f4724-e050-4a4f-b7a3-680d9d7f08c5-kube-api-access-55npp\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.649934 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-utilities\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.650347 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-catalog-content\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.650438 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-utilities\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.674785 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55npp\" (UniqueName: \"kubernetes.io/projected/556f4724-e050-4a4f-b7a3-680d9d7f08c5-kube-api-access-55npp\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.801205 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:48 crc kubenswrapper[4792]: I0301 10:24:48.153918 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ws9zl"] Mar 01 10:24:48 crc kubenswrapper[4792]: I0301 10:24:48.851400 4792 generic.go:334] "Generic (PLEG): container finished" podID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerID="0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e" exitCode=0 Mar 01 10:24:48 crc kubenswrapper[4792]: I0301 10:24:48.851724 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws9zl" event={"ID":"556f4724-e050-4a4f-b7a3-680d9d7f08c5","Type":"ContainerDied","Data":"0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e"} Mar 01 10:24:48 crc kubenswrapper[4792]: I0301 10:24:48.851751 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws9zl" event={"ID":"556f4724-e050-4a4f-b7a3-680d9d7f08c5","Type":"ContainerStarted","Data":"6dfcf844b73ec95251bb5d3c6c8335ba419bfbd6d26edbde18758b36514066bd"} Mar 01 10:24:49 crc kubenswrapper[4792]: I0301 10:24:49.860670 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-ws9zl" event={"ID":"556f4724-e050-4a4f-b7a3-680d9d7f08c5","Type":"ContainerStarted","Data":"e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40"} Mar 01 10:24:51 crc kubenswrapper[4792]: I0301 10:24:51.879564 4792 generic.go:334] "Generic (PLEG): container finished" podID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerID="e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40" exitCode=0 Mar 01 10:24:51 crc kubenswrapper[4792]: I0301 10:24:51.879649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws9zl" event={"ID":"556f4724-e050-4a4f-b7a3-680d9d7f08c5","Type":"ContainerDied","Data":"e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40"} Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.276935 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h67d8"] Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.282317 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.288005 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h67d8"] Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.368881 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-utilities\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.368965 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcqwd\" (UniqueName: \"kubernetes.io/projected/582fa17a-143d-4bb0-a32e-a6d30f3e3754-kube-api-access-qcqwd\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.371365 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-catalog-content\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.473893 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-utilities\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.474242 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qcqwd\" (UniqueName: \"kubernetes.io/projected/582fa17a-143d-4bb0-a32e-a6d30f3e3754-kube-api-access-qcqwd\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.474447 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-catalog-content\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.477367 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-utilities\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.478005 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-catalog-content\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.502788 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcqwd\" (UniqueName: \"kubernetes.io/projected/582fa17a-143d-4bb0-a32e-a6d30f3e3754-kube-api-access-qcqwd\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.600482 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.892072 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws9zl" event={"ID":"556f4724-e050-4a4f-b7a3-680d9d7f08c5","Type":"ContainerStarted","Data":"861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af"} Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.919384 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ws9zl" podStartSLOduration=2.500398993 podStartE2EDuration="5.919362643s" podCreationTimestamp="2026-03-01 10:24:47 +0000 UTC" firstStartedPulling="2026-03-01 10:24:48.853571282 +0000 UTC m=+4618.095450479" lastFinishedPulling="2026-03-01 10:24:52.272534932 +0000 UTC m=+4621.514414129" observedRunningTime="2026-03-01 10:24:52.91363953 +0000 UTC m=+4622.155518737" watchObservedRunningTime="2026-03-01 10:24:52.919362643 +0000 UTC m=+4622.161241840" Mar 01 10:24:53 crc kubenswrapper[4792]: W0301 10:24:53.071162 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod582fa17a_143d_4bb0_a32e_a6d30f3e3754.slice/crio-d04038240a388695601510cacf85197a316c819f1b4f9a710217ff1d52887310 WatchSource:0}: Error finding container d04038240a388695601510cacf85197a316c819f1b4f9a710217ff1d52887310: Status 404 returned error can't find the container with id d04038240a388695601510cacf85197a316c819f1b4f9a710217ff1d52887310 Mar 01 10:24:53 crc kubenswrapper[4792]: I0301 10:24:53.083176 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h67d8"] Mar 01 10:24:53 crc kubenswrapper[4792]: I0301 10:24:53.903027 4792 generic.go:334] "Generic (PLEG): container finished" podID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerID="db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1" 
exitCode=0 Mar 01 10:24:53 crc kubenswrapper[4792]: I0301 10:24:53.903329 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h67d8" event={"ID":"582fa17a-143d-4bb0-a32e-a6d30f3e3754","Type":"ContainerDied","Data":"db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1"} Mar 01 10:24:53 crc kubenswrapper[4792]: I0301 10:24:53.903353 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h67d8" event={"ID":"582fa17a-143d-4bb0-a32e-a6d30f3e3754","Type":"ContainerStarted","Data":"d04038240a388695601510cacf85197a316c819f1b4f9a710217ff1d52887310"} Mar 01 10:24:54 crc kubenswrapper[4792]: I0301 10:24:54.915648 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h67d8" event={"ID":"582fa17a-143d-4bb0-a32e-a6d30f3e3754","Type":"ContainerStarted","Data":"a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1"} Mar 01 10:24:56 crc kubenswrapper[4792]: I0301 10:24:56.936215 4792 generic.go:334] "Generic (PLEG): container finished" podID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerID="a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1" exitCode=0 Mar 01 10:24:56 crc kubenswrapper[4792]: I0301 10:24:56.936865 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h67d8" event={"ID":"582fa17a-143d-4bb0-a32e-a6d30f3e3754","Type":"ContainerDied","Data":"a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1"} Mar 01 10:24:57 crc kubenswrapper[4792]: I0301 10:24:57.802203 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:57 crc kubenswrapper[4792]: I0301 10:24:57.802695 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:57 crc kubenswrapper[4792]: I0301 10:24:57.853895 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:57 crc kubenswrapper[4792]: I0301 10:24:57.946625 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h67d8" event={"ID":"582fa17a-143d-4bb0-a32e-a6d30f3e3754","Type":"ContainerStarted","Data":"2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4"} Mar 01 10:24:57 crc kubenswrapper[4792]: I0301 10:24:57.966100 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h67d8" podStartSLOduration=2.513508012 podStartE2EDuration="5.966074873s" podCreationTimestamp="2026-03-01 10:24:52 +0000 UTC" firstStartedPulling="2026-03-01 10:24:53.906202131 +0000 UTC m=+4623.148081328" lastFinishedPulling="2026-03-01 10:24:57.358768992 +0000 UTC m=+4626.600648189" observedRunningTime="2026-03-01 10:24:57.963066278 +0000 UTC m=+4627.204945515" watchObservedRunningTime="2026-03-01 10:24:57.966074873 +0000 UTC m=+4627.207954070" Mar 01 10:24:58 crc kubenswrapper[4792]: I0301 10:24:58.001407 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:59 crc kubenswrapper[4792]: I0301 10:24:59.464066 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ws9zl"] Mar 01 10:24:59 crc kubenswrapper[4792]: I0301 10:24:59.965654 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ws9zl" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="registry-server" containerID="cri-o://861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af" gracePeriod=2 Mar 01 10:25:00 crc kubenswrapper[4792]: I0301 10:25:00.959131 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:25:00 crc kubenswrapper[4792]: I0301 10:25:00.974648 4792 generic.go:334] "Generic (PLEG): container finished" podID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerID="861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af" exitCode=0 Mar 01 10:25:00 crc kubenswrapper[4792]: I0301 10:25:00.974686 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws9zl" event={"ID":"556f4724-e050-4a4f-b7a3-680d9d7f08c5","Type":"ContainerDied","Data":"861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af"} Mar 01 10:25:00 crc kubenswrapper[4792]: I0301 10:25:00.974724 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:25:00 crc kubenswrapper[4792]: I0301 10:25:00.974742 4792 scope.go:117] "RemoveContainer" containerID="861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af" Mar 01 10:25:00 crc kubenswrapper[4792]: I0301 10:25:00.974731 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws9zl" event={"ID":"556f4724-e050-4a4f-b7a3-680d9d7f08c5","Type":"ContainerDied","Data":"6dfcf844b73ec95251bb5d3c6c8335ba419bfbd6d26edbde18758b36514066bd"} Mar 01 10:25:00 crc kubenswrapper[4792]: I0301 10:25:00.991515 4792 scope.go:117] "RemoveContainer" containerID="e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.008266 4792 scope.go:117] "RemoveContainer" containerID="0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.051400 4792 scope.go:117] "RemoveContainer" containerID="861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af" Mar 01 10:25:01 crc kubenswrapper[4792]: E0301 10:25:01.051844 4792 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af\": container with ID starting with 861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af not found: ID does not exist" containerID="861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.051878 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af"} err="failed to get container status \"861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af\": rpc error: code = NotFound desc = could not find container \"861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af\": container with ID starting with 861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af not found: ID does not exist" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.051960 4792 scope.go:117] "RemoveContainer" containerID="e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40" Mar 01 10:25:01 crc kubenswrapper[4792]: E0301 10:25:01.052266 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40\": container with ID starting with e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40 not found: ID does not exist" containerID="e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.052291 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40"} err="failed to get container status \"e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40\": rpc error: code = NotFound desc = could not find container 
\"e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40\": container with ID starting with e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40 not found: ID does not exist" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.052314 4792 scope.go:117] "RemoveContainer" containerID="0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e" Mar 01 10:25:01 crc kubenswrapper[4792]: E0301 10:25:01.052778 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e\": container with ID starting with 0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e not found: ID does not exist" containerID="0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.052860 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e"} err="failed to get container status \"0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e\": rpc error: code = NotFound desc = could not find container \"0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e\": container with ID starting with 0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e not found: ID does not exist" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.071601 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-utilities\") pod \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.071768 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55npp\" (UniqueName: 
\"kubernetes.io/projected/556f4724-e050-4a4f-b7a3-680d9d7f08c5-kube-api-access-55npp\") pod \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.071849 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-catalog-content\") pod \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.073098 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-utilities" (OuterVolumeSpecName: "utilities") pod "556f4724-e050-4a4f-b7a3-680d9d7f08c5" (UID: "556f4724-e050-4a4f-b7a3-680d9d7f08c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.076949 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/556f4724-e050-4a4f-b7a3-680d9d7f08c5-kube-api-access-55npp" (OuterVolumeSpecName: "kube-api-access-55npp") pod "556f4724-e050-4a4f-b7a3-680d9d7f08c5" (UID: "556f4724-e050-4a4f-b7a3-680d9d7f08c5"). InnerVolumeSpecName "kube-api-access-55npp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.121421 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "556f4724-e050-4a4f-b7a3-680d9d7f08c5" (UID: "556f4724-e050-4a4f-b7a3-680d9d7f08c5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.174480 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55npp\" (UniqueName: \"kubernetes.io/projected/556f4724-e050-4a4f-b7a3-680d9d7f08c5-kube-api-access-55npp\") on node \"crc\" DevicePath \"\"" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.174528 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.174541 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.312671 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ws9zl"] Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.325181 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ws9zl"] Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.419555 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" path="/var/lib/kubelet/pods/556f4724-e050-4a4f-b7a3-680d9d7f08c5/volumes" Mar 01 10:25:02 crc kubenswrapper[4792]: I0301 10:25:02.602043 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:25:02 crc kubenswrapper[4792]: I0301 10:25:02.602440 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:25:02 crc kubenswrapper[4792]: I0301 10:25:02.682084 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:25:03 crc kubenswrapper[4792]: I0301 10:25:03.040948 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:25:03 crc kubenswrapper[4792]: I0301 10:25:03.663257 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h67d8"] Mar 01 10:25:04 crc kubenswrapper[4792]: I0301 10:25:04.942969 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:25:04 crc kubenswrapper[4792]: I0301 10:25:04.943318 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.014334 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h67d8" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="registry-server" containerID="cri-o://2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4" gracePeriod=2 Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.555740 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.683944 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-utilities\") pod \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.684056 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcqwd\" (UniqueName: \"kubernetes.io/projected/582fa17a-143d-4bb0-a32e-a6d30f3e3754-kube-api-access-qcqwd\") pod \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.684234 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-catalog-content\") pod \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.684757 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-utilities" (OuterVolumeSpecName: "utilities") pod "582fa17a-143d-4bb0-a32e-a6d30f3e3754" (UID: "582fa17a-143d-4bb0-a32e-a6d30f3e3754"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.691363 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/582fa17a-143d-4bb0-a32e-a6d30f3e3754-kube-api-access-qcqwd" (OuterVolumeSpecName: "kube-api-access-qcqwd") pod "582fa17a-143d-4bb0-a32e-a6d30f3e3754" (UID: "582fa17a-143d-4bb0-a32e-a6d30f3e3754"). InnerVolumeSpecName "kube-api-access-qcqwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.712449 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "582fa17a-143d-4bb0-a32e-a6d30f3e3754" (UID: "582fa17a-143d-4bb0-a32e-a6d30f3e3754"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.787278 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.787316 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcqwd\" (UniqueName: \"kubernetes.io/projected/582fa17a-143d-4bb0-a32e-a6d30f3e3754-kube-api-access-qcqwd\") on node \"crc\" DevicePath \"\"" Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.787329 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.026454 4792 generic.go:334] "Generic (PLEG): container finished" podID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerID="2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4" exitCode=0 Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.026530 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h67d8" event={"ID":"582fa17a-143d-4bb0-a32e-a6d30f3e3754","Type":"ContainerDied","Data":"2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4"} Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.026565 4792 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.026757 4792 scope.go:117] "RemoveContainer" containerID="2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.026739 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h67d8" event={"ID":"582fa17a-143d-4bb0-a32e-a6d30f3e3754","Type":"ContainerDied","Data":"d04038240a388695601510cacf85197a316c819f1b4f9a710217ff1d52887310"} Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.047149 4792 scope.go:117] "RemoveContainer" containerID="a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.071787 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h67d8"] Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.087511 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h67d8"] Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.096181 4792 scope.go:117] "RemoveContainer" containerID="db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.148659 4792 scope.go:117] "RemoveContainer" containerID="2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4" Mar 01 10:25:06 crc kubenswrapper[4792]: E0301 10:25:06.149181 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4\": container with ID starting with 2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4 not found: ID does not exist" containerID="2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.149217 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4"} err="failed to get container status \"2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4\": rpc error: code = NotFound desc = could not find container \"2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4\": container with ID starting with 2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4 not found: ID does not exist" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.149241 4792 scope.go:117] "RemoveContainer" containerID="a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1" Mar 01 10:25:06 crc kubenswrapper[4792]: E0301 10:25:06.150009 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1\": container with ID starting with a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1 not found: ID does not exist" containerID="a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.150040 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1"} err="failed to get container status \"a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1\": rpc error: code = NotFound desc = could not find container \"a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1\": container with ID starting with a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1 not found: ID does not exist" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.150061 4792 scope.go:117] "RemoveContainer" containerID="db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1" Mar 01 10:25:06 crc kubenswrapper[4792]: E0301 
10:25:06.150470 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1\": container with ID starting with db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1 not found: ID does not exist" containerID="db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.150497 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1"} err="failed to get container status \"db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1\": rpc error: code = NotFound desc = could not find container \"db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1\": container with ID starting with db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1 not found: ID does not exist" Mar 01 10:25:07 crc kubenswrapper[4792]: I0301 10:25:07.422540 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" path="/var/lib/kubelet/pods/582fa17a-143d-4bb0-a32e-a6d30f3e3754/volumes" Mar 01 10:25:34 crc kubenswrapper[4792]: I0301 10:25:34.943405 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:25:34 crc kubenswrapper[4792]: I0301 10:25:34.943864 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 01 10:25:34 crc kubenswrapper[4792]: I0301 10:25:34.943899 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 10:25:34 crc kubenswrapper[4792]: I0301 10:25:34.944628 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 10:25:34 crc kubenswrapper[4792]: I0301 10:25:34.944676 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" gracePeriod=600 Mar 01 10:25:35 crc kubenswrapper[4792]: E0301 10:25:35.062959 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:25:35 crc kubenswrapper[4792]: I0301 10:25:35.266553 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" exitCode=0 Mar 01 10:25:35 crc kubenswrapper[4792]: I0301 10:25:35.266604 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43"} Mar 01 10:25:35 crc kubenswrapper[4792]: I0301 10:25:35.266642 4792 scope.go:117] "RemoveContainer" containerID="34d9173341b46ccf37e4f77b26afd17d6e6d0f7b2699af960d32ad54ab5e3db7" Mar 01 10:25:35 crc kubenswrapper[4792]: I0301 10:25:35.267339 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:25:35 crc kubenswrapper[4792]: E0301 10:25:35.267655 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:25:47 crc kubenswrapper[4792]: I0301 10:25:47.410078 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:25:47 crc kubenswrapper[4792]: E0301 10:25:47.412106 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:25:59 crc kubenswrapper[4792]: I0301 10:25:59.408948 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:25:59 crc kubenswrapper[4792]: E0301 10:25:59.411441 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.155230 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539346-l8ktk"] Mar 01 10:26:00 crc kubenswrapper[4792]: E0301 10:26:00.156589 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="extract-utilities" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.156614 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="extract-utilities" Mar 01 10:26:00 crc kubenswrapper[4792]: E0301 10:26:00.156633 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="extract-utilities" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.156642 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="extract-utilities" Mar 01 10:26:00 crc kubenswrapper[4792]: E0301 10:26:00.156659 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="registry-server" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.156669 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="registry-server" Mar 01 10:26:00 crc kubenswrapper[4792]: E0301 10:26:00.156684 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="extract-content" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.156692 
4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="extract-content" Mar 01 10:26:00 crc kubenswrapper[4792]: E0301 10:26:00.156713 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="registry-server" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.156722 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="registry-server" Mar 01 10:26:00 crc kubenswrapper[4792]: E0301 10:26:00.156745 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="extract-content" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.156754 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="extract-content" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.156990 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="registry-server" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.157031 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="registry-server" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.157900 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539346-l8ktk" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.160489 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.161782 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.163731 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.163747 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539346-l8ktk"] Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.347069 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqhq2\" (UniqueName: \"kubernetes.io/projected/9c23503b-d97f-4cef-b792-e7fbdd8934ab-kube-api-access-dqhq2\") pod \"auto-csr-approver-29539346-l8ktk\" (UID: \"9c23503b-d97f-4cef-b792-e7fbdd8934ab\") " pod="openshift-infra/auto-csr-approver-29539346-l8ktk" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.449598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqhq2\" (UniqueName: \"kubernetes.io/projected/9c23503b-d97f-4cef-b792-e7fbdd8934ab-kube-api-access-dqhq2\") pod \"auto-csr-approver-29539346-l8ktk\" (UID: \"9c23503b-d97f-4cef-b792-e7fbdd8934ab\") " pod="openshift-infra/auto-csr-approver-29539346-l8ktk" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.473650 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqhq2\" (UniqueName: \"kubernetes.io/projected/9c23503b-d97f-4cef-b792-e7fbdd8934ab-kube-api-access-dqhq2\") pod \"auto-csr-approver-29539346-l8ktk\" (UID: \"9c23503b-d97f-4cef-b792-e7fbdd8934ab\") " 
pod="openshift-infra/auto-csr-approver-29539346-l8ktk" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.477136 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539346-l8ktk" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.921343 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539346-l8ktk"] Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.926625 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 10:26:01 crc kubenswrapper[4792]: I0301 10:26:01.555932 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539346-l8ktk" event={"ID":"9c23503b-d97f-4cef-b792-e7fbdd8934ab","Type":"ContainerStarted","Data":"c98fd7e3154616a93d3b7d0af38bc303e4e5ccabeefc5244bdc91aee0c92792e"} Mar 01 10:26:02 crc kubenswrapper[4792]: I0301 10:26:02.565086 4792 generic.go:334] "Generic (PLEG): container finished" podID="9c23503b-d97f-4cef-b792-e7fbdd8934ab" containerID="f2c5b6ef4792c5289a47b67d38a5bcb076b8c29acc57acf629d74f960223e4cf" exitCode=0 Mar 01 10:26:02 crc kubenswrapper[4792]: I0301 10:26:02.565152 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539346-l8ktk" event={"ID":"9c23503b-d97f-4cef-b792-e7fbdd8934ab","Type":"ContainerDied","Data":"f2c5b6ef4792c5289a47b67d38a5bcb076b8c29acc57acf629d74f960223e4cf"} Mar 01 10:26:03 crc kubenswrapper[4792]: I0301 10:26:03.937580 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539346-l8ktk" Mar 01 10:26:04 crc kubenswrapper[4792]: I0301 10:26:04.026445 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqhq2\" (UniqueName: \"kubernetes.io/projected/9c23503b-d97f-4cef-b792-e7fbdd8934ab-kube-api-access-dqhq2\") pod \"9c23503b-d97f-4cef-b792-e7fbdd8934ab\" (UID: \"9c23503b-d97f-4cef-b792-e7fbdd8934ab\") " Mar 01 10:26:04 crc kubenswrapper[4792]: I0301 10:26:04.033745 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c23503b-d97f-4cef-b792-e7fbdd8934ab-kube-api-access-dqhq2" (OuterVolumeSpecName: "kube-api-access-dqhq2") pod "9c23503b-d97f-4cef-b792-e7fbdd8934ab" (UID: "9c23503b-d97f-4cef-b792-e7fbdd8934ab"). InnerVolumeSpecName "kube-api-access-dqhq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:26:04 crc kubenswrapper[4792]: I0301 10:26:04.129374 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqhq2\" (UniqueName: \"kubernetes.io/projected/9c23503b-d97f-4cef-b792-e7fbdd8934ab-kube-api-access-dqhq2\") on node \"crc\" DevicePath \"\"" Mar 01 10:26:04 crc kubenswrapper[4792]: I0301 10:26:04.587311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539346-l8ktk" event={"ID":"9c23503b-d97f-4cef-b792-e7fbdd8934ab","Type":"ContainerDied","Data":"c98fd7e3154616a93d3b7d0af38bc303e4e5ccabeefc5244bdc91aee0c92792e"} Mar 01 10:26:04 crc kubenswrapper[4792]: I0301 10:26:04.587348 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c98fd7e3154616a93d3b7d0af38bc303e4e5ccabeefc5244bdc91aee0c92792e" Mar 01 10:26:04 crc kubenswrapper[4792]: I0301 10:26:04.587372 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539346-l8ktk" Mar 01 10:26:05 crc kubenswrapper[4792]: I0301 10:26:05.020871 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539340-rjnp9"] Mar 01 10:26:05 crc kubenswrapper[4792]: I0301 10:26:05.033924 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539340-rjnp9"] Mar 01 10:26:05 crc kubenswrapper[4792]: I0301 10:26:05.424504 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d79ac35-053d-480b-a8ef-3b03122b0152" path="/var/lib/kubelet/pods/5d79ac35-053d-480b-a8ef-3b03122b0152/volumes" Mar 01 10:26:14 crc kubenswrapper[4792]: I0301 10:26:14.409465 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:26:14 crc kubenswrapper[4792]: E0301 10:26:14.410231 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:26:26 crc kubenswrapper[4792]: I0301 10:26:26.408868 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:26:26 crc kubenswrapper[4792]: E0301 10:26:26.409708 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:26:40 crc kubenswrapper[4792]: I0301 10:26:40.156099 4792 scope.go:117] "RemoveContainer" containerID="fd47f4e8c316ceef6350038b81becae73fa982c181d0ebd8621bb407bf6fb4b7" Mar 01 10:26:41 crc kubenswrapper[4792]: I0301 10:26:41.416523 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:26:41 crc kubenswrapper[4792]: E0301 10:26:41.417366 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.538201 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gqc89/must-gather-vdffg"] Mar 01 10:26:46 crc kubenswrapper[4792]: E0301 10:26:46.539089 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c23503b-d97f-4cef-b792-e7fbdd8934ab" containerName="oc" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.539105 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c23503b-d97f-4cef-b792-e7fbdd8934ab" containerName="oc" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.539329 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c23503b-d97f-4cef-b792-e7fbdd8934ab" containerName="oc" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.540521 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.548367 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gqc89"/"openshift-service-ca.crt" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.549604 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gqc89"/"kube-root-ca.crt" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.576758 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gqc89/must-gather-vdffg"] Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.669896 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcq2x\" (UniqueName: \"kubernetes.io/projected/f3c98b67-7926-411d-9068-0b7991b0551b-kube-api-access-mcq2x\") pod \"must-gather-vdffg\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.670109 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f3c98b67-7926-411d-9068-0b7991b0551b-must-gather-output\") pod \"must-gather-vdffg\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.772266 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcq2x\" (UniqueName: \"kubernetes.io/projected/f3c98b67-7926-411d-9068-0b7991b0551b-kube-api-access-mcq2x\") pod \"must-gather-vdffg\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.772358 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f3c98b67-7926-411d-9068-0b7991b0551b-must-gather-output\") pod \"must-gather-vdffg\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.772843 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f3c98b67-7926-411d-9068-0b7991b0551b-must-gather-output\") pod \"must-gather-vdffg\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.792178 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcq2x\" (UniqueName: \"kubernetes.io/projected/f3c98b67-7926-411d-9068-0b7991b0551b-kube-api-access-mcq2x\") pod \"must-gather-vdffg\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.867292 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:47 crc kubenswrapper[4792]: I0301 10:26:47.370864 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gqc89/must-gather-vdffg"] Mar 01 10:26:48 crc kubenswrapper[4792]: I0301 10:26:48.032338 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/must-gather-vdffg" event={"ID":"f3c98b67-7926-411d-9068-0b7991b0551b","Type":"ContainerStarted","Data":"08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8"} Mar 01 10:26:48 crc kubenswrapper[4792]: I0301 10:26:48.033292 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/must-gather-vdffg" event={"ID":"f3c98b67-7926-411d-9068-0b7991b0551b","Type":"ContainerStarted","Data":"1565ec9980556fdbf01e99188e64fa14f5e2ec76dbcd766030cf211adcc3fc4e"} Mar 01 10:26:49 crc kubenswrapper[4792]: I0301 10:26:49.042630 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/must-gather-vdffg" event={"ID":"f3c98b67-7926-411d-9068-0b7991b0551b","Type":"ContainerStarted","Data":"3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560"} Mar 01 10:26:49 crc kubenswrapper[4792]: I0301 10:26:49.066493 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gqc89/must-gather-vdffg" podStartSLOduration=3.066476008 podStartE2EDuration="3.066476008s" podCreationTimestamp="2026-03-01 10:26:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:26:49.059476612 +0000 UTC m=+4738.301355809" watchObservedRunningTime="2026-03-01 10:26:49.066476008 +0000 UTC m=+4738.308355205" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.479581 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gqc89/crc-debug-tb4mv"] Mar 01 10:26:52 crc kubenswrapper[4792]: 
I0301 10:26:52.481533 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.483415 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-gqc89"/"default-dockercfg-cw4kn" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.518336 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e45184b8-fa23-4abb-8e90-7b490f7d3c04-host\") pod \"crc-debug-tb4mv\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.518723 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9zzg\" (UniqueName: \"kubernetes.io/projected/e45184b8-fa23-4abb-8e90-7b490f7d3c04-kube-api-access-n9zzg\") pod \"crc-debug-tb4mv\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.620260 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e45184b8-fa23-4abb-8e90-7b490f7d3c04-host\") pod \"crc-debug-tb4mv\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.620427 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e45184b8-fa23-4abb-8e90-7b490f7d3c04-host\") pod \"crc-debug-tb4mv\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.620619 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n9zzg\" (UniqueName: \"kubernetes.io/projected/e45184b8-fa23-4abb-8e90-7b490f7d3c04-kube-api-access-n9zzg\") pod \"crc-debug-tb4mv\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.638823 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9zzg\" (UniqueName: \"kubernetes.io/projected/e45184b8-fa23-4abb-8e90-7b490f7d3c04-kube-api-access-n9zzg\") pod \"crc-debug-tb4mv\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.799477 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: W0301 10:26:52.834080 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode45184b8_fa23_4abb_8e90_7b490f7d3c04.slice/crio-22ca56a9c95ce18c1e3892347e183b4bae0378f6608a3f7a7e5c37af1cff7576 WatchSource:0}: Error finding container 22ca56a9c95ce18c1e3892347e183b4bae0378f6608a3f7a7e5c37af1cff7576: Status 404 returned error can't find the container with id 22ca56a9c95ce18c1e3892347e183b4bae0378f6608a3f7a7e5c37af1cff7576 Mar 01 10:26:53 crc kubenswrapper[4792]: I0301 10:26:53.087096 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" event={"ID":"e45184b8-fa23-4abb-8e90-7b490f7d3c04","Type":"ContainerStarted","Data":"9b1ac8b2a6b9a9734637c133cb7e1fc37defae1380b1bd14a5fb31b1efa6e0e5"} Mar 01 10:26:53 crc kubenswrapper[4792]: I0301 10:26:53.087402 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" 
event={"ID":"e45184b8-fa23-4abb-8e90-7b490f7d3c04","Type":"ContainerStarted","Data":"22ca56a9c95ce18c1e3892347e183b4bae0378f6608a3f7a7e5c37af1cff7576"} Mar 01 10:26:53 crc kubenswrapper[4792]: I0301 10:26:53.104867 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" podStartSLOduration=1.104845853 podStartE2EDuration="1.104845853s" podCreationTimestamp="2026-03-01 10:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:26:53.098242918 +0000 UTC m=+4742.340122115" watchObservedRunningTime="2026-03-01 10:26:53.104845853 +0000 UTC m=+4742.346725050" Mar 01 10:26:56 crc kubenswrapper[4792]: I0301 10:26:56.409620 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:26:56 crc kubenswrapper[4792]: E0301 10:26:56.410376 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:27:08 crc kubenswrapper[4792]: I0301 10:27:08.409715 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:27:08 crc kubenswrapper[4792]: E0301 10:27:08.412439 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:27:23 crc kubenswrapper[4792]: I0301 10:27:23.409146 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:27:23 crc kubenswrapper[4792]: E0301 10:27:23.411060 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:27:27 crc kubenswrapper[4792]: I0301 10:27:27.377478 4792 generic.go:334] "Generic (PLEG): container finished" podID="e45184b8-fa23-4abb-8e90-7b490f7d3c04" containerID="9b1ac8b2a6b9a9734637c133cb7e1fc37defae1380b1bd14a5fb31b1efa6e0e5" exitCode=0 Mar 01 10:27:27 crc kubenswrapper[4792]: I0301 10:27:27.377585 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" event={"ID":"e45184b8-fa23-4abb-8e90-7b490f7d3c04","Type":"ContainerDied","Data":"9b1ac8b2a6b9a9734637c133cb7e1fc37defae1380b1bd14a5fb31b1efa6e0e5"} Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.539560 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.573080 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gqc89/crc-debug-tb4mv"] Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.581565 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gqc89/crc-debug-tb4mv"] Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.633658 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9zzg\" (UniqueName: \"kubernetes.io/projected/e45184b8-fa23-4abb-8e90-7b490f7d3c04-kube-api-access-n9zzg\") pod \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.633811 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e45184b8-fa23-4abb-8e90-7b490f7d3c04-host\") pod \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.634481 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e45184b8-fa23-4abb-8e90-7b490f7d3c04-host" (OuterVolumeSpecName: "host") pod "e45184b8-fa23-4abb-8e90-7b490f7d3c04" (UID: "e45184b8-fa23-4abb-8e90-7b490f7d3c04"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.641050 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e45184b8-fa23-4abb-8e90-7b490f7d3c04-kube-api-access-n9zzg" (OuterVolumeSpecName: "kube-api-access-n9zzg") pod "e45184b8-fa23-4abb-8e90-7b490f7d3c04" (UID: "e45184b8-fa23-4abb-8e90-7b490f7d3c04"). InnerVolumeSpecName "kube-api-access-n9zzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.735943 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9zzg\" (UniqueName: \"kubernetes.io/projected/e45184b8-fa23-4abb-8e90-7b490f7d3c04-kube-api-access-n9zzg\") on node \"crc\" DevicePath \"\"" Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.735987 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e45184b8-fa23-4abb-8e90-7b490f7d3c04-host\") on node \"crc\" DevicePath \"\"" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.399241 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22ca56a9c95ce18c1e3892347e183b4bae0378f6608a3f7a7e5c37af1cff7576" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.399531 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.418978 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e45184b8-fa23-4abb-8e90-7b490f7d3c04" path="/var/lib/kubelet/pods/e45184b8-fa23-4abb-8e90-7b490f7d3c04/volumes" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.761083 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gqc89/crc-debug-b5clz"] Mar 01 10:27:29 crc kubenswrapper[4792]: E0301 10:27:29.761741 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e45184b8-fa23-4abb-8e90-7b490f7d3c04" containerName="container-00" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.761752 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45184b8-fa23-4abb-8e90-7b490f7d3c04" containerName="container-00" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.761972 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e45184b8-fa23-4abb-8e90-7b490f7d3c04" 
containerName="container-00" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.762564 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.766103 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-gqc89"/"default-dockercfg-cw4kn" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.858143 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/906114a6-330e-47e1-a2d5-b2629604ec9b-host\") pod \"crc-debug-b5clz\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.858214 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4s4x\" (UniqueName: \"kubernetes.io/projected/906114a6-330e-47e1-a2d5-b2629604ec9b-kube-api-access-d4s4x\") pod \"crc-debug-b5clz\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.959973 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/906114a6-330e-47e1-a2d5-b2629604ec9b-host\") pod \"crc-debug-b5clz\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.960055 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4s4x\" (UniqueName: \"kubernetes.io/projected/906114a6-330e-47e1-a2d5-b2629604ec9b-kube-api-access-d4s4x\") pod \"crc-debug-b5clz\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:29 crc kubenswrapper[4792]: 
I0301 10:27:29.960348 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/906114a6-330e-47e1-a2d5-b2629604ec9b-host\") pod \"crc-debug-b5clz\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.977729 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4s4x\" (UniqueName: \"kubernetes.io/projected/906114a6-330e-47e1-a2d5-b2629604ec9b-kube-api-access-d4s4x\") pod \"crc-debug-b5clz\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:30 crc kubenswrapper[4792]: I0301 10:27:30.080388 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:30 crc kubenswrapper[4792]: I0301 10:27:30.408370 4792 generic.go:334] "Generic (PLEG): container finished" podID="906114a6-330e-47e1-a2d5-b2629604ec9b" containerID="44972f8cdfee40aaa4b7d768644c72d5ce1a862062e2e025755a87475cc93e75" exitCode=0 Mar 01 10:27:30 crc kubenswrapper[4792]: I0301 10:27:30.408464 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/crc-debug-b5clz" event={"ID":"906114a6-330e-47e1-a2d5-b2629604ec9b","Type":"ContainerDied","Data":"44972f8cdfee40aaa4b7d768644c72d5ce1a862062e2e025755a87475cc93e75"} Mar 01 10:27:30 crc kubenswrapper[4792]: I0301 10:27:30.408643 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/crc-debug-b5clz" event={"ID":"906114a6-330e-47e1-a2d5-b2629604ec9b","Type":"ContainerStarted","Data":"a698a724eea7987fdc30fb2d5e78695ac28cddc639344b0f696d184f93a2fc3d"} Mar 01 10:27:30 crc kubenswrapper[4792]: I0301 10:27:30.841307 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gqc89/crc-debug-b5clz"] Mar 01 10:27:30 crc kubenswrapper[4792]: I0301 
10:27:30.850279 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gqc89/crc-debug-b5clz"] Mar 01 10:27:31 crc kubenswrapper[4792]: I0301 10:27:31.509710 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:31 crc kubenswrapper[4792]: I0301 10:27:31.632494 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4s4x\" (UniqueName: \"kubernetes.io/projected/906114a6-330e-47e1-a2d5-b2629604ec9b-kube-api-access-d4s4x\") pod \"906114a6-330e-47e1-a2d5-b2629604ec9b\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " Mar 01 10:27:31 crc kubenswrapper[4792]: I0301 10:27:31.632560 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/906114a6-330e-47e1-a2d5-b2629604ec9b-host\") pod \"906114a6-330e-47e1-a2d5-b2629604ec9b\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " Mar 01 10:27:31 crc kubenswrapper[4792]: I0301 10:27:31.632656 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/906114a6-330e-47e1-a2d5-b2629604ec9b-host" (OuterVolumeSpecName: "host") pod "906114a6-330e-47e1-a2d5-b2629604ec9b" (UID: "906114a6-330e-47e1-a2d5-b2629604ec9b"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:27:31 crc kubenswrapper[4792]: I0301 10:27:31.633461 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/906114a6-330e-47e1-a2d5-b2629604ec9b-host\") on node \"crc\" DevicePath \"\"" Mar 01 10:27:31 crc kubenswrapper[4792]: I0301 10:27:31.660579 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/906114a6-330e-47e1-a2d5-b2629604ec9b-kube-api-access-d4s4x" (OuterVolumeSpecName: "kube-api-access-d4s4x") pod "906114a6-330e-47e1-a2d5-b2629604ec9b" (UID: "906114a6-330e-47e1-a2d5-b2629604ec9b"). InnerVolumeSpecName "kube-api-access-d4s4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:27:31 crc kubenswrapper[4792]: I0301 10:27:31.735718 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4s4x\" (UniqueName: \"kubernetes.io/projected/906114a6-330e-47e1-a2d5-b2629604ec9b-kube-api-access-d4s4x\") on node \"crc\" DevicePath \"\"" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.256300 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gqc89/crc-debug-f7z2b"] Mar 01 10:27:32 crc kubenswrapper[4792]: E0301 10:27:32.256977 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906114a6-330e-47e1-a2d5-b2629604ec9b" containerName="container-00" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.256990 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="906114a6-330e-47e1-a2d5-b2629604ec9b" containerName="container-00" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.257195 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="906114a6-330e-47e1-a2d5-b2629604ec9b" containerName="container-00" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.257763 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-f7z2b"
Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.429021 4792 scope.go:117] "RemoveContainer" containerID="44972f8cdfee40aaa4b7d768644c72d5ce1a862062e2e025755a87475cc93e75"
Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.429250 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-b5clz"
Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.447467 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76rpw\" (UniqueName: \"kubernetes.io/projected/87295b8e-2de6-45e5-8a8c-67223695843f-kube-api-access-76rpw\") pod \"crc-debug-f7z2b\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") " pod="openshift-must-gather-gqc89/crc-debug-f7z2b"
Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.447855 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87295b8e-2de6-45e5-8a8c-67223695843f-host\") pod \"crc-debug-f7z2b\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") " pod="openshift-must-gather-gqc89/crc-debug-f7z2b"
Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.550147 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76rpw\" (UniqueName: \"kubernetes.io/projected/87295b8e-2de6-45e5-8a8c-67223695843f-kube-api-access-76rpw\") pod \"crc-debug-f7z2b\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") " pod="openshift-must-gather-gqc89/crc-debug-f7z2b"
Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.550299 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87295b8e-2de6-45e5-8a8c-67223695843f-host\") pod \"crc-debug-f7z2b\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") " pod="openshift-must-gather-gqc89/crc-debug-f7z2b"
Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.550386 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87295b8e-2de6-45e5-8a8c-67223695843f-host\") pod \"crc-debug-f7z2b\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") " pod="openshift-must-gather-gqc89/crc-debug-f7z2b"
Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.861436 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76rpw\" (UniqueName: \"kubernetes.io/projected/87295b8e-2de6-45e5-8a8c-67223695843f-kube-api-access-76rpw\") pod \"crc-debug-f7z2b\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") " pod="openshift-must-gather-gqc89/crc-debug-f7z2b"
Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.878415 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-f7z2b"
Mar 01 10:27:32 crc kubenswrapper[4792]: W0301 10:27:32.912854 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87295b8e_2de6_45e5_8a8c_67223695843f.slice/crio-5043b64d50537b6cfaf21fbd5e8cfde7f34c613a4ac57d32c2db7b8106aab901 WatchSource:0}: Error finding container 5043b64d50537b6cfaf21fbd5e8cfde7f34c613a4ac57d32c2db7b8106aab901: Status 404 returned error can't find the container with id 5043b64d50537b6cfaf21fbd5e8cfde7f34c613a4ac57d32c2db7b8106aab901
Mar 01 10:27:33 crc kubenswrapper[4792]: I0301 10:27:33.419122 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="906114a6-330e-47e1-a2d5-b2629604ec9b" path="/var/lib/kubelet/pods/906114a6-330e-47e1-a2d5-b2629604ec9b/volumes"
Mar 01 10:27:33 crc kubenswrapper[4792]: I0301 10:27:33.438548 4792 generic.go:334] "Generic (PLEG): container finished" podID="87295b8e-2de6-45e5-8a8c-67223695843f" containerID="f8e7a7cf72954e5a371a752e566b57991977d10fab6f499762b2d8f952868a75" exitCode=0
Mar 01 10:27:33 crc kubenswrapper[4792]: I0301 10:27:33.438586 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/crc-debug-f7z2b" event={"ID":"87295b8e-2de6-45e5-8a8c-67223695843f","Type":"ContainerDied","Data":"f8e7a7cf72954e5a371a752e566b57991977d10fab6f499762b2d8f952868a75"}
Mar 01 10:27:33 crc kubenswrapper[4792]: I0301 10:27:33.438648 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/crc-debug-f7z2b" event={"ID":"87295b8e-2de6-45e5-8a8c-67223695843f","Type":"ContainerStarted","Data":"5043b64d50537b6cfaf21fbd5e8cfde7f34c613a4ac57d32c2db7b8106aab901"}
Mar 01 10:27:33 crc kubenswrapper[4792]: I0301 10:27:33.499769 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gqc89/crc-debug-f7z2b"]
Mar 01 10:27:33 crc kubenswrapper[4792]: I0301 10:27:33.509944 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gqc89/crc-debug-f7z2b"]
Mar 01 10:27:34 crc kubenswrapper[4792]: I0301 10:27:34.545107 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-f7z2b"
Mar 01 10:27:34 crc kubenswrapper[4792]: I0301 10:27:34.697133 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87295b8e-2de6-45e5-8a8c-67223695843f-host\") pod \"87295b8e-2de6-45e5-8a8c-67223695843f\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") "
Mar 01 10:27:34 crc kubenswrapper[4792]: I0301 10:27:34.697254 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76rpw\" (UniqueName: \"kubernetes.io/projected/87295b8e-2de6-45e5-8a8c-67223695843f-kube-api-access-76rpw\") pod \"87295b8e-2de6-45e5-8a8c-67223695843f\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") "
Mar 01 10:27:34 crc kubenswrapper[4792]: I0301 10:27:34.698042 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87295b8e-2de6-45e5-8a8c-67223695843f-host" (OuterVolumeSpecName: "host") pod "87295b8e-2de6-45e5-8a8c-67223695843f" (UID: "87295b8e-2de6-45e5-8a8c-67223695843f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 01 10:27:34 crc kubenswrapper[4792]: I0301 10:27:34.703946 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87295b8e-2de6-45e5-8a8c-67223695843f-kube-api-access-76rpw" (OuterVolumeSpecName: "kube-api-access-76rpw") pod "87295b8e-2de6-45e5-8a8c-67223695843f" (UID: "87295b8e-2de6-45e5-8a8c-67223695843f"). InnerVolumeSpecName "kube-api-access-76rpw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:27:34 crc kubenswrapper[4792]: I0301 10:27:34.799839 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87295b8e-2de6-45e5-8a8c-67223695843f-host\") on node \"crc\" DevicePath \"\""
Mar 01 10:27:34 crc kubenswrapper[4792]: I0301 10:27:34.799874 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76rpw\" (UniqueName: \"kubernetes.io/projected/87295b8e-2de6-45e5-8a8c-67223695843f-kube-api-access-76rpw\") on node \"crc\" DevicePath \"\""
Mar 01 10:27:35 crc kubenswrapper[4792]: I0301 10:27:35.426344 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87295b8e-2de6-45e5-8a8c-67223695843f" path="/var/lib/kubelet/pods/87295b8e-2de6-45e5-8a8c-67223695843f/volumes"
Mar 01 10:27:35 crc kubenswrapper[4792]: I0301 10:27:35.460099 4792 scope.go:117] "RemoveContainer" containerID="f8e7a7cf72954e5a371a752e566b57991977d10fab6f499762b2d8f952868a75"
Mar 01 10:27:35 crc kubenswrapper[4792]: I0301 10:27:35.460361 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-f7z2b"
Mar 01 10:27:36 crc kubenswrapper[4792]: I0301 10:27:36.410226 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43"
Mar 01 10:27:36 crc kubenswrapper[4792]: E0301 10:27:36.410822 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:27:49 crc kubenswrapper[4792]: I0301 10:27:49.409104 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43"
Mar 01 10:27:49 crc kubenswrapper[4792]: E0301 10:27:49.409883 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:27:49 crc kubenswrapper[4792]: I0301 10:27:49.951514 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cxhh6"]
Mar 01 10:27:49 crc kubenswrapper[4792]: E0301 10:27:49.952582 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87295b8e-2de6-45e5-8a8c-67223695843f" containerName="container-00"
Mar 01 10:27:49 crc kubenswrapper[4792]: I0301 10:27:49.952695 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="87295b8e-2de6-45e5-8a8c-67223695843f" containerName="container-00"
Mar 01 10:27:49 crc kubenswrapper[4792]: I0301 10:27:49.953064 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="87295b8e-2de6-45e5-8a8c-67223695843f" containerName="container-00"
Mar 01 10:27:49 crc kubenswrapper[4792]: I0301 10:27:49.954835 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:27:49 crc kubenswrapper[4792]: I0301 10:27:49.971505 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxhh6"]
Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.026410 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-utilities\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.026510 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dxk6\" (UniqueName: \"kubernetes.io/projected/12314c82-8a7f-466f-939a-158a53420f72-kube-api-access-4dxk6\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.026586 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-catalog-content\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.127934 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-catalog-content\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.128057 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-utilities\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.128125 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dxk6\" (UniqueName: \"kubernetes.io/projected/12314c82-8a7f-466f-939a-158a53420f72-kube-api-access-4dxk6\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.128507 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-catalog-content\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.128551 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-utilities\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.154324 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dxk6\" (UniqueName: \"kubernetes.io/projected/12314c82-8a7f-466f-939a-158a53420f72-kube-api-access-4dxk6\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.311193 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.820461 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxhh6"]
Mar 01 10:27:51 crc kubenswrapper[4792]: I0301 10:27:51.629382 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxhh6" event={"ID":"12314c82-8a7f-466f-939a-158a53420f72","Type":"ContainerStarted","Data":"1118d4695af3d42785b70cb2f47d1de1e099600c2214f5c7391a3af697aaf1ab"}
Mar 01 10:27:52 crc kubenswrapper[4792]: I0301 10:27:52.637630 4792 generic.go:334] "Generic (PLEG): container finished" podID="12314c82-8a7f-466f-939a-158a53420f72" containerID="8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31" exitCode=0
Mar 01 10:27:52 crc kubenswrapper[4792]: I0301 10:27:52.637773 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxhh6" event={"ID":"12314c82-8a7f-466f-939a-158a53420f72","Type":"ContainerDied","Data":"8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31"}
Mar 01 10:27:53 crc kubenswrapper[4792]: I0301 10:27:53.646646 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxhh6" event={"ID":"12314c82-8a7f-466f-939a-158a53420f72","Type":"ContainerStarted","Data":"a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b"}
Mar 01 10:27:58 crc kubenswrapper[4792]: I0301 10:27:58.686007 4792 generic.go:334] "Generic (PLEG): container finished" podID="12314c82-8a7f-466f-939a-158a53420f72" containerID="a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b" exitCode=0
Mar 01 10:27:58 crc kubenswrapper[4792]: I0301 10:27:58.686631 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxhh6" event={"ID":"12314c82-8a7f-466f-939a-158a53420f72","Type":"ContainerDied","Data":"a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b"}
Mar 01 10:27:59 crc kubenswrapper[4792]: I0301 10:27:59.696474 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxhh6" event={"ID":"12314c82-8a7f-466f-939a-158a53420f72","Type":"ContainerStarted","Data":"1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505"}
Mar 01 10:27:59 crc kubenswrapper[4792]: I0301 10:27:59.715416 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cxhh6" podStartSLOduration=4.252022057 podStartE2EDuration="10.715395777s" podCreationTimestamp="2026-03-01 10:27:49 +0000 UTC" firstStartedPulling="2026-03-01 10:27:52.639654778 +0000 UTC m=+4801.881533975" lastFinishedPulling="2026-03-01 10:27:59.103028498 +0000 UTC m=+4808.344907695" observedRunningTime="2026-03-01 10:27:59.712523085 +0000 UTC m=+4808.954402282" watchObservedRunningTime="2026-03-01 10:27:59.715395777 +0000 UTC m=+4808.957274974"
Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.152118 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539348-5hvql"]
Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.153979 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539348-5hvql"
Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.156009 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz"
Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.156460 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.156826 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.175046 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539348-5hvql"]
Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.247928 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffmwm\" (UniqueName: \"kubernetes.io/projected/5867157f-b16a-460c-afc4-0981a4d8ee43-kube-api-access-ffmwm\") pod \"auto-csr-approver-29539348-5hvql\" (UID: \"5867157f-b16a-460c-afc4-0981a4d8ee43\") " pod="openshift-infra/auto-csr-approver-29539348-5hvql"
Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.311570 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.311625 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.356397 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffmwm\" (UniqueName: \"kubernetes.io/projected/5867157f-b16a-460c-afc4-0981a4d8ee43-kube-api-access-ffmwm\") pod \"auto-csr-approver-29539348-5hvql\" (UID: \"5867157f-b16a-460c-afc4-0981a4d8ee43\") " pod="openshift-infra/auto-csr-approver-29539348-5hvql"
Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.374814 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffmwm\" (UniqueName: \"kubernetes.io/projected/5867157f-b16a-460c-afc4-0981a4d8ee43-kube-api-access-ffmwm\") pod \"auto-csr-approver-29539348-5hvql\" (UID: \"5867157f-b16a-460c-afc4-0981a4d8ee43\") " pod="openshift-infra/auto-csr-approver-29539348-5hvql"
Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.409047 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43"
Mar 01 10:28:00 crc kubenswrapper[4792]: E0301 10:28:00.409317 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.485361 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539348-5hvql"
Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.892401 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539348-5hvql"]
Mar 01 10:28:01 crc kubenswrapper[4792]: I0301 10:28:01.365919 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cxhh6" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="registry-server" probeResult="failure" output=<
Mar 01 10:28:01 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 01 10:28:01 crc kubenswrapper[4792]: >
Mar 01 10:28:01 crc kubenswrapper[4792]: I0301 10:28:01.723534 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539348-5hvql" event={"ID":"5867157f-b16a-460c-afc4-0981a4d8ee43","Type":"ContainerStarted","Data":"8c1e2f3178dea59ab178a2142d754bd6d93e0069c8e569ad774e0d802ed3549e"}
Mar 01 10:28:02 crc kubenswrapper[4792]: I0301 10:28:02.733248 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539348-5hvql" event={"ID":"5867157f-b16a-460c-afc4-0981a4d8ee43","Type":"ContainerStarted","Data":"9237ca7f55124284e9b80295a7be3e3ee8987057df25870f237eb05840050933"}
Mar 01 10:28:02 crc kubenswrapper[4792]: I0301 10:28:02.747066 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539348-5hvql" podStartSLOduration=1.726261451 podStartE2EDuration="2.747044708s" podCreationTimestamp="2026-03-01 10:28:00 +0000 UTC" firstStartedPulling="2026-03-01 10:28:00.901929289 +0000 UTC m=+4810.143808486" lastFinishedPulling="2026-03-01 10:28:01.922712546 +0000 UTC m=+4811.164591743" observedRunningTime="2026-03-01 10:28:02.745990352 +0000 UTC m=+4811.987869549" watchObservedRunningTime="2026-03-01 10:28:02.747044708 +0000 UTC m=+4811.988923915"
Mar 01 10:28:03 crc kubenswrapper[4792]: I0301 10:28:03.742770 4792 generic.go:334] "Generic (PLEG): container finished" podID="5867157f-b16a-460c-afc4-0981a4d8ee43" containerID="9237ca7f55124284e9b80295a7be3e3ee8987057df25870f237eb05840050933" exitCode=0
Mar 01 10:28:03 crc kubenswrapper[4792]: I0301 10:28:03.743201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539348-5hvql" event={"ID":"5867157f-b16a-460c-afc4-0981a4d8ee43","Type":"ContainerDied","Data":"9237ca7f55124284e9b80295a7be3e3ee8987057df25870f237eb05840050933"}
Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.110942 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539348-5hvql"
Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.174959 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffmwm\" (UniqueName: \"kubernetes.io/projected/5867157f-b16a-460c-afc4-0981a4d8ee43-kube-api-access-ffmwm\") pod \"5867157f-b16a-460c-afc4-0981a4d8ee43\" (UID: \"5867157f-b16a-460c-afc4-0981a4d8ee43\") "
Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.180533 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5867157f-b16a-460c-afc4-0981a4d8ee43-kube-api-access-ffmwm" (OuterVolumeSpecName: "kube-api-access-ffmwm") pod "5867157f-b16a-460c-afc4-0981a4d8ee43" (UID: "5867157f-b16a-460c-afc4-0981a4d8ee43"). InnerVolumeSpecName "kube-api-access-ffmwm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.276840 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffmwm\" (UniqueName: \"kubernetes.io/projected/5867157f-b16a-460c-afc4-0981a4d8ee43-kube-api-access-ffmwm\") on node \"crc\" DevicePath \"\""
Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.760799 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539348-5hvql" event={"ID":"5867157f-b16a-460c-afc4-0981a4d8ee43","Type":"ContainerDied","Data":"8c1e2f3178dea59ab178a2142d754bd6d93e0069c8e569ad774e0d802ed3549e"}
Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.761064 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c1e2f3178dea59ab178a2142d754bd6d93e0069c8e569ad774e0d802ed3549e"
Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.760866 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539348-5hvql"
Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.821117 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539342-5t24p"]
Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.832503 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539342-5t24p"]
Mar 01 10:28:07 crc kubenswrapper[4792]: I0301 10:28:07.418237 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="821a550c-e6ff-4517-a306-11ea497be759" path="/var/lib/kubelet/pods/821a550c-e6ff-4517-a306-11ea497be759/volumes"
Mar 01 10:28:10 crc kubenswrapper[4792]: I0301 10:28:10.361863 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:28:10 crc kubenswrapper[4792]: I0301 10:28:10.424823 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:28:10 crc kubenswrapper[4792]: I0301 10:28:10.606863 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cxhh6"]
Mar 01 10:28:11 crc kubenswrapper[4792]: I0301 10:28:11.809429 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cxhh6" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="registry-server" containerID="cri-o://1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505" gracePeriod=2
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.295654 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.425831 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-catalog-content\") pod \"12314c82-8a7f-466f-939a-158a53420f72\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") "
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.426144 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dxk6\" (UniqueName: \"kubernetes.io/projected/12314c82-8a7f-466f-939a-158a53420f72-kube-api-access-4dxk6\") pod \"12314c82-8a7f-466f-939a-158a53420f72\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") "
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.426190 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-utilities\") pod \"12314c82-8a7f-466f-939a-158a53420f72\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") "
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.426664 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-utilities" (OuterVolumeSpecName: "utilities") pod "12314c82-8a7f-466f-939a-158a53420f72" (UID: "12314c82-8a7f-466f-939a-158a53420f72"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.431422 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12314c82-8a7f-466f-939a-158a53420f72-kube-api-access-4dxk6" (OuterVolumeSpecName: "kube-api-access-4dxk6") pod "12314c82-8a7f-466f-939a-158a53420f72" (UID: "12314c82-8a7f-466f-939a-158a53420f72"). InnerVolumeSpecName "kube-api-access-4dxk6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.528205 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dxk6\" (UniqueName: \"kubernetes.io/projected/12314c82-8a7f-466f-939a-158a53420f72-kube-api-access-4dxk6\") on node \"crc\" DevicePath \"\""
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.528237 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-utilities\") on node \"crc\" DevicePath \"\""
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.568754 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12314c82-8a7f-466f-939a-158a53420f72" (UID: "12314c82-8a7f-466f-939a-158a53420f72"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.630270 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.820584 4792 generic.go:334] "Generic (PLEG): container finished" podID="12314c82-8a7f-466f-939a-158a53420f72" containerID="1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505" exitCode=0
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.820663 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxhh6" event={"ID":"12314c82-8a7f-466f-939a-158a53420f72","Type":"ContainerDied","Data":"1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505"}
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.820666 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxhh6"
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.820705 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxhh6" event={"ID":"12314c82-8a7f-466f-939a-158a53420f72","Type":"ContainerDied","Data":"1118d4695af3d42785b70cb2f47d1de1e099600c2214f5c7391a3af697aaf1ab"}
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.820766 4792 scope.go:117] "RemoveContainer" containerID="1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505"
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.854443 4792 scope.go:117] "RemoveContainer" containerID="a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b"
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.883253 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cxhh6"]
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.891848 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cxhh6"]
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.898625 4792 scope.go:117] "RemoveContainer" containerID="8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31"
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.951669 4792 scope.go:117] "RemoveContainer" containerID="1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505"
Mar 01 10:28:12 crc kubenswrapper[4792]: E0301 10:28:12.952168 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505\": container with ID starting with 1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505 not found: ID does not exist" containerID="1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505"
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.952206 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505"} err="failed to get container status \"1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505\": rpc error: code = NotFound desc = could not find container \"1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505\": container with ID starting with 1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505 not found: ID does not exist"
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.952229 4792 scope.go:117] "RemoveContainer" containerID="a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b"
Mar 01 10:28:12 crc kubenswrapper[4792]: E0301 10:28:12.952449 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b\": container with ID starting with a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b not found: ID does not exist" containerID="a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b"
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.952480 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b"} err="failed to get container status \"a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b\": rpc error: code = NotFound desc = could not find container \"a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b\": container with ID starting with a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b not found: ID does not exist"
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.952500 4792 scope.go:117] "RemoveContainer" containerID="8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31"
Mar 01 10:28:12 crc kubenswrapper[4792]: E0301 10:28:12.952704 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31\": container with ID starting with 8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31 not found: ID does not exist" containerID="8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31"
Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.952726 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31"} err="failed to get container status \"8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31\": rpc error: code = NotFound desc = could not find container \"8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31\": container with ID starting with 8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31 not found: ID does not exist"
Mar 01 10:28:13 crc kubenswrapper[4792]: I0301 10:28:13.418961 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12314c82-8a7f-466f-939a-158a53420f72" path="/var/lib/kubelet/pods/12314c82-8a7f-466f-939a-158a53420f72/volumes"
Mar 01 10:28:14 crc kubenswrapper[4792]: I0301 10:28:14.408861 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43"
Mar 01 10:28:14 crc kubenswrapper[4792]: E0301 10:28:14.409177 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.463282 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k4zrv"]
Mar 01 10:28:22 crc kubenswrapper[4792]: E0301 10:28:22.464408 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="registry-server"
Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.464439 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="registry-server"
Mar 01 10:28:22 crc kubenswrapper[4792]: E0301 10:28:22.464451 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5867157f-b16a-460c-afc4-0981a4d8ee43" containerName="oc"
Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.464459 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5867157f-b16a-460c-afc4-0981a4d8ee43" containerName="oc"
Mar 01 10:28:22 crc kubenswrapper[4792]: E0301 10:28:22.464505 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="extract-utilities"
Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.464515 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="extract-utilities"
Mar 01 10:28:22 crc kubenswrapper[4792]: E0301 10:28:22.464529 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="extract-content"
Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.464536 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="extract-content"
Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.464777 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5867157f-b16a-460c-afc4-0981a4d8ee43" containerName="oc"
Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.464818 4792 memory_manager.go:354] "RemoveStaleState removing state"
podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="registry-server" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.466580 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.477367 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k4zrv"] Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.574932 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-catalog-content\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.575324 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4b95\" (UniqueName: \"kubernetes.io/projected/7a74a274-b6f7-421d-917b-33034be46bcf-kube-api-access-v4b95\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.575404 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-utilities\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.677398 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4b95\" (UniqueName: \"kubernetes.io/projected/7a74a274-b6f7-421d-917b-33034be46bcf-kube-api-access-v4b95\") pod \"community-operators-k4zrv\" 
(UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.678080 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-utilities\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.678131 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-utilities\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.678338 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-catalog-content\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.678628 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-catalog-content\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.712497 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4b95\" (UniqueName: \"kubernetes.io/projected/7a74a274-b6f7-421d-917b-33034be46bcf-kube-api-access-v4b95\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " 
pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.845001 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:23 crc kubenswrapper[4792]: I0301 10:28:23.447812 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k4zrv"] Mar 01 10:28:23 crc kubenswrapper[4792]: W0301 10:28:23.455078 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a74a274_b6f7_421d_917b_33034be46bcf.slice/crio-83306a83f874a192e94ea93f5f35eeb60edae0342b45325c1cc5caf61b810c5d WatchSource:0}: Error finding container 83306a83f874a192e94ea93f5f35eeb60edae0342b45325c1cc5caf61b810c5d: Status 404 returned error can't find the container with id 83306a83f874a192e94ea93f5f35eeb60edae0342b45325c1cc5caf61b810c5d Mar 01 10:28:23 crc kubenswrapper[4792]: I0301 10:28:23.947434 4792 generic.go:334] "Generic (PLEG): container finished" podID="7a74a274-b6f7-421d-917b-33034be46bcf" containerID="3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e" exitCode=0 Mar 01 10:28:23 crc kubenswrapper[4792]: I0301 10:28:23.947532 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4zrv" event={"ID":"7a74a274-b6f7-421d-917b-33034be46bcf","Type":"ContainerDied","Data":"3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e"} Mar 01 10:28:23 crc kubenswrapper[4792]: I0301 10:28:23.948022 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4zrv" event={"ID":"7a74a274-b6f7-421d-917b-33034be46bcf","Type":"ContainerStarted","Data":"83306a83f874a192e94ea93f5f35eeb60edae0342b45325c1cc5caf61b810c5d"} Mar 01 10:28:24 crc kubenswrapper[4792]: I0301 10:28:24.958210 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-k4zrv" event={"ID":"7a74a274-b6f7-421d-917b-33034be46bcf","Type":"ContainerStarted","Data":"015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66"} Mar 01 10:28:25 crc kubenswrapper[4792]: I0301 10:28:25.409859 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:28:25 crc kubenswrapper[4792]: E0301 10:28:25.410363 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:28:26 crc kubenswrapper[4792]: I0301 10:28:26.978961 4792 generic.go:334] "Generic (PLEG): container finished" podID="7a74a274-b6f7-421d-917b-33034be46bcf" containerID="015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66" exitCode=0 Mar 01 10:28:26 crc kubenswrapper[4792]: I0301 10:28:26.980408 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4zrv" event={"ID":"7a74a274-b6f7-421d-917b-33034be46bcf","Type":"ContainerDied","Data":"015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66"} Mar 01 10:28:27 crc kubenswrapper[4792]: I0301 10:28:27.991445 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4zrv" event={"ID":"7a74a274-b6f7-421d-917b-33034be46bcf","Type":"ContainerStarted","Data":"059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce"} Mar 01 10:28:28 crc kubenswrapper[4792]: I0301 10:28:28.019297 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k4zrv" 
podStartSLOduration=2.564903192 podStartE2EDuration="6.019274688s" podCreationTimestamp="2026-03-01 10:28:22 +0000 UTC" firstStartedPulling="2026-03-01 10:28:23.949589209 +0000 UTC m=+4833.191468406" lastFinishedPulling="2026-03-01 10:28:27.403960715 +0000 UTC m=+4836.645839902" observedRunningTime="2026-03-01 10:28:28.010535859 +0000 UTC m=+4837.252415056" watchObservedRunningTime="2026-03-01 10:28:28.019274688 +0000 UTC m=+4837.261153885" Mar 01 10:28:32 crc kubenswrapper[4792]: I0301 10:28:32.845941 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:32 crc kubenswrapper[4792]: I0301 10:28:32.846509 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:32 crc kubenswrapper[4792]: I0301 10:28:32.899932 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:33 crc kubenswrapper[4792]: I0301 10:28:33.075079 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:33 crc kubenswrapper[4792]: I0301 10:28:33.136623 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k4zrv"] Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.041688 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k4zrv" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="registry-server" containerID="cri-o://059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce" gracePeriod=2 Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.488165 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.617088 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4b95\" (UniqueName: \"kubernetes.io/projected/7a74a274-b6f7-421d-917b-33034be46bcf-kube-api-access-v4b95\") pod \"7a74a274-b6f7-421d-917b-33034be46bcf\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.617286 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-utilities\") pod \"7a74a274-b6f7-421d-917b-33034be46bcf\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.617413 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-catalog-content\") pod \"7a74a274-b6f7-421d-917b-33034be46bcf\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.618248 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-utilities" (OuterVolumeSpecName: "utilities") pod "7a74a274-b6f7-421d-917b-33034be46bcf" (UID: "7a74a274-b6f7-421d-917b-33034be46bcf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.623207 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a74a274-b6f7-421d-917b-33034be46bcf-kube-api-access-v4b95" (OuterVolumeSpecName: "kube-api-access-v4b95") pod "7a74a274-b6f7-421d-917b-33034be46bcf" (UID: "7a74a274-b6f7-421d-917b-33034be46bcf"). InnerVolumeSpecName "kube-api-access-v4b95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.666958 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a74a274-b6f7-421d-917b-33034be46bcf" (UID: "7a74a274-b6f7-421d-917b-33034be46bcf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.719559 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.719598 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.719616 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4b95\" (UniqueName: \"kubernetes.io/projected/7a74a274-b6f7-421d-917b-33034be46bcf-kube-api-access-v4b95\") on node \"crc\" DevicePath \"\"" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.051377 4792 generic.go:334] "Generic (PLEG): container finished" podID="7a74a274-b6f7-421d-917b-33034be46bcf" containerID="059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce" exitCode=0 Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.051438 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.051435 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4zrv" event={"ID":"7a74a274-b6f7-421d-917b-33034be46bcf","Type":"ContainerDied","Data":"059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce"} Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.051498 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4zrv" event={"ID":"7a74a274-b6f7-421d-917b-33034be46bcf","Type":"ContainerDied","Data":"83306a83f874a192e94ea93f5f35eeb60edae0342b45325c1cc5caf61b810c5d"} Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.051517 4792 scope.go:117] "RemoveContainer" containerID="059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.089943 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k4zrv"] Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.103423 4792 scope.go:117] "RemoveContainer" containerID="015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.107331 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k4zrv"] Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.133273 4792 scope.go:117] "RemoveContainer" containerID="3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.173217 4792 scope.go:117] "RemoveContainer" containerID="059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce" Mar 01 10:28:36 crc kubenswrapper[4792]: E0301 10:28:36.173709 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce\": container with ID starting with 059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce not found: ID does not exist" containerID="059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.173755 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce"} err="failed to get container status \"059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce\": rpc error: code = NotFound desc = could not find container \"059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce\": container with ID starting with 059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce not found: ID does not exist" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.173789 4792 scope.go:117] "RemoveContainer" containerID="015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66" Mar 01 10:28:36 crc kubenswrapper[4792]: E0301 10:28:36.174188 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66\": container with ID starting with 015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66 not found: ID does not exist" containerID="015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.174231 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66"} err="failed to get container status \"015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66\": rpc error: code = NotFound desc = could not find container \"015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66\": container with ID 
starting with 015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66 not found: ID does not exist" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.174263 4792 scope.go:117] "RemoveContainer" containerID="3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e" Mar 01 10:28:36 crc kubenswrapper[4792]: E0301 10:28:36.174668 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e\": container with ID starting with 3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e not found: ID does not exist" containerID="3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.174716 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e"} err="failed to get container status \"3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e\": rpc error: code = NotFound desc = could not find container \"3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e\": container with ID starting with 3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e not found: ID does not exist" Mar 01 10:28:37 crc kubenswrapper[4792]: I0301 10:28:37.409172 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:28:37 crc kubenswrapper[4792]: E0301 10:28:37.409712 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:28:37 crc kubenswrapper[4792]: I0301 10:28:37.419038 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" path="/var/lib/kubelet/pods/7a74a274-b6f7-421d-917b-33034be46bcf/volumes" Mar 01 10:28:40 crc kubenswrapper[4792]: I0301 10:28:40.303965 4792 scope.go:117] "RemoveContainer" containerID="ec9227dfee7817fbe5632f2b037665aa0557b7bf6988989846a2502b4704a463" Mar 01 10:28:49 crc kubenswrapper[4792]: I0301 10:28:49.685788 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f86869f48-jg6nw_6c97472a-b6b7-4fc4-b872-a318812f0999/barbican-api/0.log" Mar 01 10:28:49 crc kubenswrapper[4792]: I0301 10:28:49.916015 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64f98fd86b-96l6n_d30c642c-b4ae-495a-8acd-cc8be4a0f412/barbican-keystone-listener/0.log" Mar 01 10:28:49 crc kubenswrapper[4792]: I0301 10:28:49.940887 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f86869f48-jg6nw_6c97472a-b6b7-4fc4-b872-a318812f0999/barbican-api-log/0.log" Mar 01 10:28:49 crc kubenswrapper[4792]: I0301 10:28:49.986730 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64f98fd86b-96l6n_d30c642c-b4ae-495a-8acd-cc8be4a0f412/barbican-keystone-listener-log/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.250577 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65f4d58895-tvn59_26fbd30a-a485-4463-9aac-bb695c43e9e3/barbican-worker/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.278511 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65f4d58895-tvn59_26fbd30a-a485-4463-9aac-bb695c43e9e3/barbican-worker-log/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.538621 4792 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb_1201ca91-41eb-45d0-991d-71883b4014ae/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.668423 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/ceilometer-notification-agent/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.682494 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/ceilometer-central-agent/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.747168 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/proxy-httpd/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.873499 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/sg-core/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.958082 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj_f3a428e9-b35d-4f80-bb40-c158095d5bfa/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:51 crc kubenswrapper[4792]: I0301 10:28:51.159076 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m_2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:51 crc kubenswrapper[4792]: I0301 10:28:51.295210 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_084f9db1-15eb-458c-8b43-aeb5dbb0555f/cinder-api/0.log" Mar 01 10:28:51 crc kubenswrapper[4792]: I0301 10:28:51.306189 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_084f9db1-15eb-458c-8b43-aeb5dbb0555f/cinder-api-log/0.log" Mar 01 10:28:51 crc kubenswrapper[4792]: I0301 10:28:51.649076 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_23d15722-3d0f-44ce-ac55-eba67760f0e9/probe/0.log" Mar 01 10:28:51 crc kubenswrapper[4792]: I0301 10:28:51.742223 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_23d15722-3d0f-44ce-ac55-eba67760f0e9/cinder-backup/0.log" Mar 01 10:28:51 crc kubenswrapper[4792]: I0301 10:28:51.829336 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_688f590f-ae5c-4caf-b8c7-013a118f42c5/cinder-scheduler/0.log" Mar 01 10:28:51 crc kubenswrapper[4792]: I0301 10:28:51.999410 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_688f590f-ae5c-4caf-b8c7-013a118f42c5/probe/0.log" Mar 01 10:28:52 crc kubenswrapper[4792]: I0301 10:28:52.120244 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d3ca4743-fa6c-4e2e-b2c8-b2362f44a727/cinder-volume/0.log" Mar 01 10:28:52 crc kubenswrapper[4792]: I0301 10:28:52.140592 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d3ca4743-fa6c-4e2e-b2c8-b2362f44a727/probe/0.log" Mar 01 10:28:52 crc kubenswrapper[4792]: I0301 10:28:52.408972 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:28:52 crc kubenswrapper[4792]: E0301 10:28:52.409199 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:28:52 crc kubenswrapper[4792]: I0301 10:28:52.567655 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-dtgks_f25228f4-912f-408c-a1d6-9279c350b767/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:52 crc kubenswrapper[4792]: I0301 10:28:52.772705 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5_cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:52 crc kubenswrapper[4792]: I0301 10:28:52.886700 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-fqgkv_49541358-1fd0-4d1d-8b61-0c618994dfc0/init/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 10:28:53.056556 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-fqgkv_49541358-1fd0-4d1d-8b61-0c618994dfc0/init/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 10:28:53.200830 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_52b189da-3327-40c1-bf22-a842b0980593/glance-httpd/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 10:28:53.266778 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-fqgkv_49541358-1fd0-4d1d-8b61-0c618994dfc0/dnsmasq-dns/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 10:28:53.292109 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_52b189da-3327-40c1-bf22-a842b0980593/glance-log/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 10:28:53.492505 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9d055103-6c35-481f-820a-7aa363543404/glance-httpd/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 
10:28:53.508261 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9d055103-6c35-481f-820a-7aa363543404/glance-log/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 10:28:53.909461 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-79f8cb6d9d-xg7h5_d7f79f77-ac1b-445e-8e28-85c8964f5461/horizon/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 10:28:53.985893 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-xqprh_d11c64e6-0562-41d9-a213-f1c5749b4c83/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:54 crc kubenswrapper[4792]: I0301 10:28:54.025640 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-79f8cb6d9d-xg7h5_d7f79f77-ac1b-445e-8e28-85c8964f5461/horizon-log/0.log" Mar 01 10:28:54 crc kubenswrapper[4792]: I0301 10:28:54.235496 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4rw28_822af429-9091-43e5-a16d-7a287f2c5bb2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:54 crc kubenswrapper[4792]: I0301 10:28:54.417264 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-749f685d77-ggsln_b60e7776-3e2a-4e08-900d-cd39a29a78bc/keystone-api/0.log" Mar 01 10:28:54 crc kubenswrapper[4792]: I0301 10:28:54.573963 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29539321-sclgm_7ec04609-b280-4df0-a0c5-2e4c7208c1c6/keystone-cron/0.log" Mar 01 10:28:54 crc kubenswrapper[4792]: I0301 10:28:54.736142 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6f21d62f-3539-4d5d-aeaa-cc816a51d412/kube-state-metrics/0.log" Mar 01 10:28:54 crc kubenswrapper[4792]: I0301 10:28:54.857809 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt_c7230f65-7e9a-4455-8d25-c49393bfbafe/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:54 crc kubenswrapper[4792]: I0301 10:28:54.942644 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_e6660fdc-5636-44ec-b6c0-e0e417d72e8a/manila-api-log/0.log" Mar 01 10:28:55 crc kubenswrapper[4792]: I0301 10:28:55.100040 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_e6660fdc-5636-44ec-b6c0-e0e417d72e8a/manila-api/0.log" Mar 01 10:28:55 crc kubenswrapper[4792]: I0301 10:28:55.269209 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_5813cf9a-1d9e-4a74-82e1-68e994c9175a/probe/0.log" Mar 01 10:28:55 crc kubenswrapper[4792]: I0301 10:28:55.275189 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_5813cf9a-1d9e-4a74-82e1-68e994c9175a/manila-scheduler/0.log" Mar 01 10:28:55 crc kubenswrapper[4792]: I0301 10:28:55.433443 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_03462f2f-874f-496a-934b-9fa6e2c55850/manila-share/0.log" Mar 01 10:28:55 crc kubenswrapper[4792]: I0301 10:28:55.449503 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_03462f2f-874f-496a-934b-9fa6e2c55850/probe/0.log" Mar 01 10:28:55 crc kubenswrapper[4792]: I0301 10:28:55.835526 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6c8bdfb955-kjg92_ceced30a-39e5-413f-a498-e5d4500f1eea/neutron-httpd/0.log" Mar 01 10:28:55 crc kubenswrapper[4792]: I0301 10:28:55.850639 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6c8bdfb955-kjg92_ceced30a-39e5-413f-a498-e5d4500f1eea/neutron-api/0.log" Mar 01 10:28:56 crc kubenswrapper[4792]: I0301 10:28:56.473436 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt_f737af00-5e6f-4a95-bf94-738b72990ebd/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:56 crc kubenswrapper[4792]: I0301 10:28:56.993750 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9c6de822-b7f5-4530-bb5b-ca879ff899fc/nova-api-log/0.log" Mar 01 10:28:57 crc kubenswrapper[4792]: I0301 10:28:57.077646 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f95aafcd-79b6-4ece-b3e1-ee9ea32a2754/nova-cell0-conductor-conductor/0.log" Mar 01 10:28:57 crc kubenswrapper[4792]: I0301 10:28:57.461894 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9c6de822-b7f5-4530-bb5b-ca879ff899fc/nova-api-api/0.log" Mar 01 10:28:57 crc kubenswrapper[4792]: I0301 10:28:57.485213 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9ef6cc4e-2fd6-403b-a163-638395c30672/nova-cell1-conductor-conductor/0.log" Mar 01 10:28:57 crc kubenswrapper[4792]: I0301 10:28:57.579290 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_63afaac7-c934-4410-b2b5-ab04ad085489/nova-cell1-novncproxy-novncproxy/0.log" Mar 01 10:28:57 crc kubenswrapper[4792]: I0301 10:28:57.799148 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7_d7776778-c586-4ab6-8fdf-bfed4168992d/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:57 crc kubenswrapper[4792]: I0301 10:28:57.968546 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cbf9560f-212f-460a-9a4d-250e20b00d18/nova-metadata-log/0.log" Mar 01 10:28:58 crc kubenswrapper[4792]: I0301 10:28:58.322247 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_3a38c1a1-88bc-4bce-aea4-13e676aab111/nova-scheduler-scheduler/0.log" Mar 01 10:28:58 crc kubenswrapper[4792]: I0301 10:28:58.430335 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f2d03d42-7830-444b-a8ae-c91e16d352b9/mysql-bootstrap/0.log" Mar 01 10:28:58 crc kubenswrapper[4792]: I0301 10:28:58.692478 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f2d03d42-7830-444b-a8ae-c91e16d352b9/mysql-bootstrap/0.log" Mar 01 10:28:58 crc kubenswrapper[4792]: I0301 10:28:58.711112 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f2d03d42-7830-444b-a8ae-c91e16d352b9/galera/0.log" Mar 01 10:28:58 crc kubenswrapper[4792]: I0301 10:28:58.947398 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b969e6eb-14a7-4e45-8342-ccbd05c06261/mysql-bootstrap/0.log" Mar 01 10:28:59 crc kubenswrapper[4792]: I0301 10:28:59.271484 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b969e6eb-14a7-4e45-8342-ccbd05c06261/galera/0.log" Mar 01 10:28:59 crc kubenswrapper[4792]: I0301 10:28:59.338380 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b969e6eb-14a7-4e45-8342-ccbd05c06261/mysql-bootstrap/0.log" Mar 01 10:28:59 crc kubenswrapper[4792]: I0301 10:28:59.606342 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_fecafda6-dcf9-46ea-8678-8da499154ad7/openstackclient/0.log" Mar 01 10:28:59 crc kubenswrapper[4792]: I0301 10:28:59.684387 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cbf9560f-212f-460a-9a4d-250e20b00d18/nova-metadata-metadata/0.log" Mar 01 10:28:59 crc kubenswrapper[4792]: I0301 10:28:59.749417 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-7wc55_9493aff0-58e3-44ca-ba01-69f3b284d732/openstack-network-exporter/0.log" Mar 01 10:28:59 crc kubenswrapper[4792]: I0301 10:28:59.970764 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mpvqc_d50ee3b1-4f97-4644-802d-04c85d9c3abc/ovn-controller/0.log" Mar 01 10:29:00 crc kubenswrapper[4792]: I0301 10:29:00.617123 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovsdb-server-init/0.log" Mar 01 10:29:00 crc kubenswrapper[4792]: I0301 10:29:00.848777 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovsdb-server-init/0.log" Mar 01 10:29:00 crc kubenswrapper[4792]: I0301 10:29:00.889887 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovs-vswitchd/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.032451 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovsdb-server/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.286576 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-bc5rl_e4b8a64b-6bea-426c-b1f5-2372342d4211/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.301056 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1712e112-23fd-402b-ae0b-f63a594d4fab/openstack-network-exporter/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.393510 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1712e112-23fd-402b-ae0b-f63a594d4fab/ovn-northd/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.587693 4792 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2c9312b5-705e-42f0-8462-62c8fdeb0791/openstack-network-exporter/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.681957 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2c9312b5-705e-42f0-8462-62c8fdeb0791/ovsdbserver-nb/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.853170 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a20f7417-3c04-411a-88b9-d60664faaee3/openstack-network-exporter/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.942179 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a20f7417-3c04-411a-88b9-d60664faaee3/ovsdbserver-sb/0.log" Mar 01 10:29:02 crc kubenswrapper[4792]: I0301 10:29:02.118687 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-84f9696594-qdwsv_18f9e703-dec0-46e1-a428-580bdb68e54e/placement-api/0.log" Mar 01 10:29:02 crc kubenswrapper[4792]: I0301 10:29:02.202554 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-84f9696594-qdwsv_18f9e703-dec0-46e1-a428-580bdb68e54e/placement-log/0.log" Mar 01 10:29:02 crc kubenswrapper[4792]: I0301 10:29:02.331989 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e0e1dd7a-6a53-446d-bf90-5813f7a3fda0/setup-container/0.log" Mar 01 10:29:02 crc kubenswrapper[4792]: I0301 10:29:02.591706 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63658b27-63d9-4a0f-afca-3a3c245b9b9d/setup-container/0.log" Mar 01 10:29:02 crc kubenswrapper[4792]: I0301 10:29:02.601114 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e0e1dd7a-6a53-446d-bf90-5813f7a3fda0/setup-container/0.log" Mar 01 10:29:02 crc kubenswrapper[4792]: I0301 10:29:02.636522 4792 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e0e1dd7a-6a53-446d-bf90-5813f7a3fda0/rabbitmq/0.log" Mar 01 10:29:03 crc kubenswrapper[4792]: I0301 10:29:03.345477 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63658b27-63d9-4a0f-afca-3a3c245b9b9d/setup-container/0.log" Mar 01 10:29:03 crc kubenswrapper[4792]: I0301 10:29:03.394673 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d_34275228-a1ab-4955-9d16-d184643a86d1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:29:03 crc kubenswrapper[4792]: I0301 10:29:03.426863 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63658b27-63d9-4a0f-afca-3a3c245b9b9d/rabbitmq/0.log" Mar 01 10:29:03 crc kubenswrapper[4792]: I0301 10:29:03.747734 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64_6c517000-6918-4f58-871b-7c4d26197ccf/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:29:03 crc kubenswrapper[4792]: I0301 10:29:03.793360 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gxxr7_ff733b23-0a97-4623-9eeb-339aa02fc3b0/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:29:04 crc kubenswrapper[4792]: I0301 10:29:04.010185 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-8k5rj_ac58ff00-ba74-492a-97f1-e72c56686f1d/ssh-known-hosts-edpm-deployment/0.log" Mar 01 10:29:04 crc kubenswrapper[4792]: I0301 10:29:04.451525 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ee1c75ce-61f7-4ce5-a757-b7405d7135bd/tempest-tests-tempest-tests-runner/0.log" Mar 01 10:29:04 crc kubenswrapper[4792]: I0301 10:29:04.617725 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_478d8531-4e8e-4775-999d-42af4afef106/test-operator-logs-container/0.log" Mar 01 10:29:05 crc kubenswrapper[4792]: I0301 10:29:05.264420 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-phn2l_59b987d8-9463-48cb-9651-1e5cb16aa764/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:29:07 crc kubenswrapper[4792]: I0301 10:29:07.408866 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:29:07 crc kubenswrapper[4792]: E0301 10:29:07.410096 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:29:16 crc kubenswrapper[4792]: I0301 10:29:16.914544 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_84d455ad-7bbb-4771-a8ed-9aa1984e1d40/memcached/0.log" Mar 01 10:29:22 crc kubenswrapper[4792]: I0301 10:29:22.409389 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:29:22 crc kubenswrapper[4792]: E0301 10:29:22.410193 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 
01 10:29:37 crc kubenswrapper[4792]: I0301 10:29:37.409384 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:29:37 crc kubenswrapper[4792]: E0301 10:29:37.410336 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:29:38 crc kubenswrapper[4792]: I0301 10:29:38.155310 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/util/0.log" Mar 01 10:29:38 crc kubenswrapper[4792]: I0301 10:29:38.397288 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/util/0.log" Mar 01 10:29:38 crc kubenswrapper[4792]: I0301 10:29:38.443485 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/pull/0.log" Mar 01 10:29:38 crc kubenswrapper[4792]: I0301 10:29:38.456301 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/pull/0.log" Mar 01 10:29:38 crc kubenswrapper[4792]: I0301 10:29:38.596420 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/pull/0.log" Mar 01 10:29:38 crc 
kubenswrapper[4792]: I0301 10:29:38.642153 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/util/0.log" Mar 01 10:29:38 crc kubenswrapper[4792]: I0301 10:29:38.682552 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/extract/0.log" Mar 01 10:29:39 crc kubenswrapper[4792]: I0301 10:29:39.355720 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-72srw_bf1f37ea-a566-4dfd-b45b-02f284f19ce3/manager/0.log" Mar 01 10:29:39 crc kubenswrapper[4792]: I0301 10:29:39.678871 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-9wzbh_02dd5cc0-c44b-4ede-972b-9d26c9c54100/manager/0.log" Mar 01 10:29:39 crc kubenswrapper[4792]: I0301 10:29:39.958084 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-7v65r_5044cf86-f557-41d4-b6c0-a41a668ac999/manager/0.log" Mar 01 10:29:40 crc kubenswrapper[4792]: I0301 10:29:40.277785 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-55qzx_cd83ed19-023d-43c2-92db-d290499db3d4/manager/0.log" Mar 01 10:29:40 crc kubenswrapper[4792]: I0301 10:29:40.880570 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-jvw5j_2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5/manager/0.log" Mar 01 10:29:41 crc kubenswrapper[4792]: I0301 10:29:41.160325 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-dsqtf_ea6739c2-185a-43e7-8fcf-0b2ae31957a0/manager/0.log" Mar 01 10:29:41 crc kubenswrapper[4792]: I0301 10:29:41.339109 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-jlnsb_8741a141-0194-4eb2-956e-c41f4ffe1338/manager/0.log" Mar 01 10:29:41 crc kubenswrapper[4792]: I0301 10:29:41.553225 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-wjf62_234d2ae5-7589-44cc-83f4-b0ee8a91940a/manager/0.log" Mar 01 10:29:41 crc kubenswrapper[4792]: I0301 10:29:41.660010 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-t5fsn_376afe52-646d-44b7-b32e-ce6cd6dc21a6/manager/0.log" Mar 01 10:29:42 crc kubenswrapper[4792]: I0301 10:29:42.495175 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-hlzm6_1793465e-1273-4250-a238-c99798788618/manager/0.log" Mar 01 10:29:42 crc kubenswrapper[4792]: I0301 10:29:42.526404 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-qjqd2_dfb10d33-c4f1-4287-be83-dff835c733ba/manager/0.log" Mar 01 10:29:42 crc kubenswrapper[4792]: I0301 10:29:42.888898 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-knk7m_8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9/manager/0.log" Mar 01 10:29:42 crc kubenswrapper[4792]: I0301 10:29:42.906122 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-54rpl_ecc17c18-7695-4d22-9a95-bcac51800d60/manager/0.log" Mar 01 10:29:43 crc kubenswrapper[4792]: I0301 10:29:43.635336 4792 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7b4cc4776948grv_9244686e-175e-45f9-9eb7-23621cd1f3cd/manager/0.log" Mar 01 10:29:43 crc kubenswrapper[4792]: I0301 10:29:43.928597 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-595c94944c-vtchh_c967e6f5-6388-4ae5-9ccf-379b6305e1b0/operator/0.log" Mar 01 10:29:44 crc kubenswrapper[4792]: I0301 10:29:44.298343 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-5kfk4_dc22117a-72a7-4838-bb1c-111e91514b98/registry-server/0.log" Mar 01 10:29:44 crc kubenswrapper[4792]: I0301 10:29:44.375201 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-zkx7c_3d38195c-e4ff-49cf-9592-e9f52d73f2df/manager/0.log" Mar 01 10:29:44 crc kubenswrapper[4792]: I0301 10:29:44.556746 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-jdn6k_808b8753-0a20-419b-8b04-dcbccaa2d77e/manager/0.log" Mar 01 10:29:44 crc kubenswrapper[4792]: I0301 10:29:44.700555 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-r5l9m_1ecd6b07-eda9-41d6-90af-6471699ff808/operator/0.log" Mar 01 10:29:44 crc kubenswrapper[4792]: I0301 10:29:44.958496 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-mqndr_e0cef8e2-a392-4612-97c6-17c611b2a44e/manager/0.log" Mar 01 10:29:45 crc kubenswrapper[4792]: I0301 10:29:45.230279 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-jpxwz_4fe8270e-a46d-40bc-8d24-a4585b196f5e/manager/0.log" Mar 01 10:29:45 crc kubenswrapper[4792]: I0301 10:29:45.300765 4792 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-bcnns_2970c60c-7b03-4667-99e4-08c094cdbfc2/manager/0.log" Mar 01 10:29:45 crc kubenswrapper[4792]: I0301 10:29:45.503252 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-64lkf_e45ebab9-87d5-4b2f-b3d1-f1832864584d/manager/0.log" Mar 01 10:29:45 crc kubenswrapper[4792]: I0301 10:29:45.952727 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-864b865b94-5ndlx_d1d3783f-78e9-461a-916a-5a46e3083e70/manager/0.log" Mar 01 10:29:49 crc kubenswrapper[4792]: I0301 10:29:49.411316 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:29:49 crc kubenswrapper[4792]: E0301 10:29:49.411832 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:29:50 crc kubenswrapper[4792]: I0301 10:29:50.694771 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-ggspg_b9e3fd6b-e3e2-4380-b8d7-900891df562a/manager/0.log" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.154843 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539350-vrwj8"] Mar 01 10:30:00 crc kubenswrapper[4792]: E0301 10:30:00.155590 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="extract-content" Mar 01 10:30:00 crc 
kubenswrapper[4792]: I0301 10:30:00.155620 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="extract-content" Mar 01 10:30:00 crc kubenswrapper[4792]: E0301 10:30:00.155638 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="extract-utilities" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.155644 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="extract-utilities" Mar 01 10:30:00 crc kubenswrapper[4792]: E0301 10:30:00.155661 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="registry-server" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.155668 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="registry-server" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.155859 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="registry-server" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.156665 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539350-vrwj8" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.159682 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.159726 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.160343 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.178545 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz"] Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.179702 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.183402 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.183774 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.195660 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539350-vrwj8"] Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.212868 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz"] Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.293722 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/cf24ca39-d196-4cdd-8521-da51a4f51649-secret-volume\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.294100 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fdq4\" (UniqueName: \"kubernetes.io/projected/f461ce0a-d106-4086-a698-987b95f5f03e-kube-api-access-4fdq4\") pod \"auto-csr-approver-29539350-vrwj8\" (UID: \"f461ce0a-d106-4086-a698-987b95f5f03e\") " pod="openshift-infra/auto-csr-approver-29539350-vrwj8" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.294252 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf24ca39-d196-4cdd-8521-da51a4f51649-config-volume\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.294415 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf4pb\" (UniqueName: \"kubernetes.io/projected/cf24ca39-d196-4cdd-8521-da51a4f51649-kube-api-access-gf4pb\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.395801 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf24ca39-d196-4cdd-8521-da51a4f51649-secret-volume\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 
10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.395900 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fdq4\" (UniqueName: \"kubernetes.io/projected/f461ce0a-d106-4086-a698-987b95f5f03e-kube-api-access-4fdq4\") pod \"auto-csr-approver-29539350-vrwj8\" (UID: \"f461ce0a-d106-4086-a698-987b95f5f03e\") " pod="openshift-infra/auto-csr-approver-29539350-vrwj8"
Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.395978 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf24ca39-d196-4cdd-8521-da51a4f51649-config-volume\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz"
Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.396050 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf4pb\" (UniqueName: \"kubernetes.io/projected/cf24ca39-d196-4cdd-8521-da51a4f51649-kube-api-access-gf4pb\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz"
Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.397514 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf24ca39-d196-4cdd-8521-da51a4f51649-config-volume\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz"
Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.563713 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf4pb\" (UniqueName: \"kubernetes.io/projected/cf24ca39-d196-4cdd-8521-da51a4f51649-kube-api-access-gf4pb\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz"
Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.563831 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf24ca39-d196-4cdd-8521-da51a4f51649-secret-volume\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz"
Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.564045 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fdq4\" (UniqueName: \"kubernetes.io/projected/f461ce0a-d106-4086-a698-987b95f5f03e-kube-api-access-4fdq4\") pod \"auto-csr-approver-29539350-vrwj8\" (UID: \"f461ce0a-d106-4086-a698-987b95f5f03e\") " pod="openshift-infra/auto-csr-approver-29539350-vrwj8"
Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.785208 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539350-vrwj8"
Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.804654 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz"
Mar 01 10:30:01 crc kubenswrapper[4792]: I0301 10:30:01.356992 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539350-vrwj8"]
Mar 01 10:30:01 crc kubenswrapper[4792]: I0301 10:30:01.368823 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz"]
Mar 01 10:30:01 crc kubenswrapper[4792]: I0301 10:30:01.416261 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43"
Mar 01 10:30:01 crc kubenswrapper[4792]: E0301 10:30:01.416807 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:30:01 crc kubenswrapper[4792]: I0301 10:30:01.924736 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539350-vrwj8" event={"ID":"f461ce0a-d106-4086-a698-987b95f5f03e","Type":"ContainerStarted","Data":"10ba251a37312235f0dee9803ff2e110d026416ee1d83aa9693bad44cb7d068e"}
Mar 01 10:30:01 crc kubenswrapper[4792]: I0301 10:30:01.936067 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" event={"ID":"cf24ca39-d196-4cdd-8521-da51a4f51649","Type":"ContainerStarted","Data":"f553d8ce8b9a885e297acd39ab9caadfb2d86a33b2a8d2eb99d093eaf06f4e9f"}
Mar 01 10:30:01 crc kubenswrapper[4792]: I0301 10:30:01.936120 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" event={"ID":"cf24ca39-d196-4cdd-8521-da51a4f51649","Type":"ContainerStarted","Data":"f62065739fd8378d875f429e28e19ba02569d3da54f6ca4c619f6d141a3465fe"}
Mar 01 10:30:01 crc kubenswrapper[4792]: I0301 10:30:01.961254 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" podStartSLOduration=1.9612374940000001 podStartE2EDuration="1.961237494s" podCreationTimestamp="2026-03-01 10:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:30:01.959025329 +0000 UTC m=+4931.200904546" watchObservedRunningTime="2026-03-01 10:30:01.961237494 +0000 UTC m=+4931.203116691"
Mar 01 10:30:02 crc kubenswrapper[4792]: E0301 10:30:02.242164 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf24ca39_d196_4cdd_8521_da51a4f51649.slice/crio-conmon-f553d8ce8b9a885e297acd39ab9caadfb2d86a33b2a8d2eb99d093eaf06f4e9f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf24ca39_d196_4cdd_8521_da51a4f51649.slice/crio-f553d8ce8b9a885e297acd39ab9caadfb2d86a33b2a8d2eb99d093eaf06f4e9f.scope\": RecentStats: unable to find data in memory cache]"
Mar 01 10:30:02 crc kubenswrapper[4792]: I0301 10:30:02.945402 4792 generic.go:334] "Generic (PLEG): container finished" podID="cf24ca39-d196-4cdd-8521-da51a4f51649" containerID="f553d8ce8b9a885e297acd39ab9caadfb2d86a33b2a8d2eb99d093eaf06f4e9f" exitCode=0
Mar 01 10:30:02 crc kubenswrapper[4792]: I0301 10:30:02.945496 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" event={"ID":"cf24ca39-d196-4cdd-8521-da51a4f51649","Type":"ContainerDied","Data":"f553d8ce8b9a885e297acd39ab9caadfb2d86a33b2a8d2eb99d093eaf06f4e9f"}
Mar 01 10:30:03 crc kubenswrapper[4792]: I0301 10:30:03.954040 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539350-vrwj8" event={"ID":"f461ce0a-d106-4086-a698-987b95f5f03e","Type":"ContainerStarted","Data":"72dbc4f60f34ad4150df8188c6c0da347f12c8f2e84aa4c765ae93732fb406a9"}
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.326441 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz"
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.345384 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539350-vrwj8" podStartSLOduration=2.265024954 podStartE2EDuration="4.345364456s" podCreationTimestamp="2026-03-01 10:30:00 +0000 UTC" firstStartedPulling="2026-03-01 10:30:01.384482646 +0000 UTC m=+4930.626361843" lastFinishedPulling="2026-03-01 10:30:03.464822148 +0000 UTC m=+4932.706701345" observedRunningTime="2026-03-01 10:30:03.970934389 +0000 UTC m=+4933.212813576" watchObservedRunningTime="2026-03-01 10:30:04.345364456 +0000 UTC m=+4933.587243653"
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.385477 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf4pb\" (UniqueName: \"kubernetes.io/projected/cf24ca39-d196-4cdd-8521-da51a4f51649-kube-api-access-gf4pb\") pod \"cf24ca39-d196-4cdd-8521-da51a4f51649\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") "
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.385694 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf24ca39-d196-4cdd-8521-da51a4f51649-secret-volume\") pod \"cf24ca39-d196-4cdd-8521-da51a4f51649\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") "
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.385720 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf24ca39-d196-4cdd-8521-da51a4f51649-config-volume\") pod \"cf24ca39-d196-4cdd-8521-da51a4f51649\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") "
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.390097 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf24ca39-d196-4cdd-8521-da51a4f51649-config-volume" (OuterVolumeSpecName: "config-volume") pod "cf24ca39-d196-4cdd-8521-da51a4f51649" (UID: "cf24ca39-d196-4cdd-8521-da51a4f51649"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.398835 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf24ca39-d196-4cdd-8521-da51a4f51649-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cf24ca39-d196-4cdd-8521-da51a4f51649" (UID: "cf24ca39-d196-4cdd-8521-da51a4f51649"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.420127 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf24ca39-d196-4cdd-8521-da51a4f51649-kube-api-access-gf4pb" (OuterVolumeSpecName: "kube-api-access-gf4pb") pod "cf24ca39-d196-4cdd-8521-da51a4f51649" (UID: "cf24ca39-d196-4cdd-8521-da51a4f51649"). InnerVolumeSpecName "kube-api-access-gf4pb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.442580 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz"]
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.455324 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz"]
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.488182 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf4pb\" (UniqueName: \"kubernetes.io/projected/cf24ca39-d196-4cdd-8521-da51a4f51649-kube-api-access-gf4pb\") on node \"crc\" DevicePath \"\""
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.489828 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf24ca39-d196-4cdd-8521-da51a4f51649-config-volume\") on node \"crc\" DevicePath \"\""
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.489939 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf24ca39-d196-4cdd-8521-da51a4f51649-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.966856 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" event={"ID":"cf24ca39-d196-4cdd-8521-da51a4f51649","Type":"ContainerDied","Data":"f62065739fd8378d875f429e28e19ba02569d3da54f6ca4c619f6d141a3465fe"}
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.966896 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f62065739fd8378d875f429e28e19ba02569d3da54f6ca4c619f6d141a3465fe"
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.968127 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz"
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.970319 4792 generic.go:334] "Generic (PLEG): container finished" podID="f461ce0a-d106-4086-a698-987b95f5f03e" containerID="72dbc4f60f34ad4150df8188c6c0da347f12c8f2e84aa4c765ae93732fb406a9" exitCode=0
Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.970367 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539350-vrwj8" event={"ID":"f461ce0a-d106-4086-a698-987b95f5f03e","Type":"ContainerDied","Data":"72dbc4f60f34ad4150df8188c6c0da347f12c8f2e84aa4c765ae93732fb406a9"}
Mar 01 10:30:05 crc kubenswrapper[4792]: I0301 10:30:05.421176 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" path="/var/lib/kubelet/pods/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c/volumes"
Mar 01 10:30:06 crc kubenswrapper[4792]: I0301 10:30:06.312383 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539350-vrwj8"
Mar 01 10:30:06 crc kubenswrapper[4792]: I0301 10:30:06.432694 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fdq4\" (UniqueName: \"kubernetes.io/projected/f461ce0a-d106-4086-a698-987b95f5f03e-kube-api-access-4fdq4\") pod \"f461ce0a-d106-4086-a698-987b95f5f03e\" (UID: \"f461ce0a-d106-4086-a698-987b95f5f03e\") "
Mar 01 10:30:06 crc kubenswrapper[4792]: I0301 10:30:06.439343 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f461ce0a-d106-4086-a698-987b95f5f03e-kube-api-access-4fdq4" (OuterVolumeSpecName: "kube-api-access-4fdq4") pod "f461ce0a-d106-4086-a698-987b95f5f03e" (UID: "f461ce0a-d106-4086-a698-987b95f5f03e"). InnerVolumeSpecName "kube-api-access-4fdq4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:30:06 crc kubenswrapper[4792]: I0301 10:30:06.535219 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fdq4\" (UniqueName: \"kubernetes.io/projected/f461ce0a-d106-4086-a698-987b95f5f03e-kube-api-access-4fdq4\") on node \"crc\" DevicePath \"\""
Mar 01 10:30:07 crc kubenswrapper[4792]: I0301 10:30:07.009812 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539350-vrwj8" event={"ID":"f461ce0a-d106-4086-a698-987b95f5f03e","Type":"ContainerDied","Data":"10ba251a37312235f0dee9803ff2e110d026416ee1d83aa9693bad44cb7d068e"}
Mar 01 10:30:07 crc kubenswrapper[4792]: I0301 10:30:07.010064 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10ba251a37312235f0dee9803ff2e110d026416ee1d83aa9693bad44cb7d068e"
Mar 01 10:30:07 crc kubenswrapper[4792]: I0301 10:30:07.010118 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539350-vrwj8"
Mar 01 10:30:07 crc kubenswrapper[4792]: I0301 10:30:07.033340 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539344-r7v49"]
Mar 01 10:30:07 crc kubenswrapper[4792]: I0301 10:30:07.062387 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539344-r7v49"]
Mar 01 10:30:07 crc kubenswrapper[4792]: I0301 10:30:07.419122 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c5eb940-780b-4b4d-ab60-e1ad0c284811" path="/var/lib/kubelet/pods/6c5eb940-780b-4b4d-ab60-e1ad0c284811/volumes"
Mar 01 10:30:10 crc kubenswrapper[4792]: I0301 10:30:10.640175 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-9smfd_e0b63d94-59de-45da-8058-89714bea7a90/control-plane-machine-set-operator/0.log"
Mar 01 10:30:10 crc kubenswrapper[4792]: I0301 10:30:10.757048 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nv4bp_51683a24-edad-4808-b2ec-6a628bfdd937/kube-rbac-proxy/0.log"
Mar 01 10:30:10 crc kubenswrapper[4792]: I0301 10:30:10.824149 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nv4bp_51683a24-edad-4808-b2ec-6a628bfdd937/machine-api-operator/0.log"
Mar 01 10:30:16 crc kubenswrapper[4792]: I0301 10:30:16.410590 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43"
Mar 01 10:30:16 crc kubenswrapper[4792]: E0301 10:30:16.411444 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:30:23 crc kubenswrapper[4792]: I0301 10:30:23.056243 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-4qgsm_bf71ada0-c7b2-4255-bb2c-31ec3309a29d/cert-manager-controller/0.log"
Mar 01 10:30:23 crc kubenswrapper[4792]: I0301 10:30:23.152368 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-tm5s6_2071887a-31a9-428d-92d0-bf8a361011ca/cert-manager-cainjector/0.log"
Mar 01 10:30:23 crc kubenswrapper[4792]: I0301 10:30:23.235111 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-rckpb_a03eedd4-ecde-4905-95a7-c43b45ef9da9/cert-manager-webhook/0.log"
Mar 01 10:30:27 crc kubenswrapper[4792]: I0301 10:30:27.409587 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43"
Mar 01 10:30:27 crc kubenswrapper[4792]: E0301 10:30:27.410875 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:30:35 crc kubenswrapper[4792]: I0301 10:30:35.813070 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-mtxkm_f7ca92c8-f38b-4a0a-b330-5809993cbb49/nmstate-console-plugin/0.log"
Mar 01 10:30:36 crc kubenswrapper[4792]: I0301 10:30:36.033732 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-97cv9_bfe2cc56-28ca-4201-ba5a-4208dd1ec818/nmstate-metrics/0.log"
Mar 01 10:30:36 crc kubenswrapper[4792]: I0301 10:30:36.041756 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-9j2tz_7105919f-ddac-45db-a8f7-bd927e5737df/nmstate-handler/0.log"
Mar 01 10:30:36 crc kubenswrapper[4792]: I0301 10:30:36.065183 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-97cv9_bfe2cc56-28ca-4201-ba5a-4208dd1ec818/kube-rbac-proxy/0.log"
Mar 01 10:30:36 crc kubenswrapper[4792]: I0301 10:30:36.350187 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-chfpw_fb942d1c-2a1a-4265-ae29-02f185d4cc40/nmstate-operator/0.log"
Mar 01 10:30:36 crc kubenswrapper[4792]: I0301 10:30:36.416277 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-zwhpc_aa2300d6-10c0-4dc9-812a-fcb30f09920e/nmstate-webhook/0.log"
Mar 01 10:30:40 crc kubenswrapper[4792]: I0301 10:30:40.451569 4792 scope.go:117] "RemoveContainer" containerID="1f71a688006db007bbde2ad2f2afb131f6dc80a772403871242301338cd9bc3e"
Mar 01 10:30:40 crc kubenswrapper[4792]: I0301 10:30:40.516328 4792 scope.go:117] "RemoveContainer" containerID="5aa67c39154d74ad3f75b4616f3e6439b947759682dc6085d9ce37f8cd99894c"
Mar 01 10:30:42 crc kubenswrapper[4792]: I0301 10:30:42.409135 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43"
Mar 01 10:30:43 crc kubenswrapper[4792]: I0301 10:30:43.295772 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"7b8e6b0aba17b92d064bcafd390c97d2064d3be27e2b966c778b962700333543"}
Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.054503 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-twxml_f73a6813-31ea-4018-bd23-45bf2f1dfe89/kube-rbac-proxy/0.log"
Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.138773 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-twxml_f73a6813-31ea-4018-bd23-45bf2f1dfe89/controller/0.log"
Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.259866 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log"
Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.411468 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log"
Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.439172 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log"
Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.473843 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log"
Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.507314 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log"
Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.690336 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log"
Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.717232 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log"
Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.722888 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log"
Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.746794 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log"
Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.912162 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log"
Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.916354 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log"
Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.932692 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log"
Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:09.998795 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/controller/0.log"
Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:10.102582 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/frr-metrics/0.log"
Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:10.190660 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/kube-rbac-proxy/0.log"
Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:10.253671 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/kube-rbac-proxy-frr/0.log"
Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:10.287047 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/reloader/0.log"
Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:10.602348 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-kfnzk_d2f0572c-e661-495c-873c-6e2d18f2ab7d/frr-k8s-webhook-server/0.log"
Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:10.795736 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5cd84fcfbc-lrpmz_ba22e25a-31e8-4ca7-b169-f7433eda818b/manager/0.log"
Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:10.916064 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-776c7d78bd-jwfh6_cf86866e-8afa-44da-a688-e1c018a025bd/webhook-server/0.log"
Mar 01 10:31:11 crc kubenswrapper[4792]: I0301 10:31:11.269844 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zpr27_8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7/kube-rbac-proxy/0.log"
Mar 01 10:31:11 crc kubenswrapper[4792]: I0301 10:31:11.727783 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zpr27_8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7/speaker/0.log"
Mar 01 10:31:12 crc kubenswrapper[4792]: I0301 10:31:12.003735 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/frr/0.log"
Mar 01 10:31:25 crc kubenswrapper[4792]: I0301 10:31:25.579633 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/util/0.log"
Mar 01 10:31:25 crc kubenswrapper[4792]: I0301 10:31:25.699959 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/util/0.log"
Mar 01 10:31:25 crc kubenswrapper[4792]: I0301 10:31:25.769272 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/pull/0.log"
Mar 01 10:31:25 crc kubenswrapper[4792]: I0301 10:31:25.804633 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/pull/0.log"
Mar 01 10:31:25 crc kubenswrapper[4792]: I0301 10:31:25.952827 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/extract/0.log"
Mar 01 10:31:25 crc kubenswrapper[4792]: I0301 10:31:25.979239 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/pull/0.log"
Mar 01 10:31:25 crc kubenswrapper[4792]: I0301 10:31:25.992794 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/util/0.log"
Mar 01 10:31:26 crc kubenswrapper[4792]: I0301 10:31:26.857069 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-utilities/0.log"
Mar 01 10:31:26 crc kubenswrapper[4792]: I0301 10:31:26.960809 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-utilities/0.log"
Mar 01 10:31:26 crc kubenswrapper[4792]: I0301 10:31:26.980461 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-content/0.log"
Mar 01 10:31:27 crc kubenswrapper[4792]: I0301 10:31:27.031361 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-content/0.log"
Mar 01 10:31:27 crc kubenswrapper[4792]: I0301 10:31:27.256845 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-utilities/0.log"
Mar 01 10:31:27 crc kubenswrapper[4792]: I0301 10:31:27.262640 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-content/0.log"
Mar 01 10:31:27 crc kubenswrapper[4792]: I0301 10:31:27.474483 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-utilities/0.log"
Mar 01 10:31:27 crc kubenswrapper[4792]: I0301 10:31:27.754116 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-utilities/0.log"
Mar 01 10:31:27 crc kubenswrapper[4792]: I0301 10:31:27.790452 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/registry-server/0.log"
Mar 01 10:31:27 crc kubenswrapper[4792]: I0301 10:31:27.831245 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-content/0.log"
Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.127291 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-content/0.log"
Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.314980 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-content/0.log"
Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.364663 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-utilities/0.log"
Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.646083 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/util/0.log"
Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.912709 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/pull/0.log"
Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.927448 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/registry-server/0.log"
Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.978049 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/pull/0.log"
Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.996896 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/util/0.log"
Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.100711 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/util/0.log"
Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.134520 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/pull/0.log"
Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.175974 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/extract/0.log"
Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.309755 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gfkbs_46fe59e7-8122-4621-ae8d-237a91daee5e/marketplace-operator/0.log"
Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.384456 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-utilities/0.log"
Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.554706 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-content/0.log"
Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.556126 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-utilities/0.log"
Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.620231 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-content/0.log"
Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.841150 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-content/0.log"
Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.849080 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-utilities/0.log"
Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.943148 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/registry-server/0.log"
Mar 01 10:31:30 crc kubenswrapper[4792]: I0301 10:31:30.036425 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-utilities/0.log"
Mar 01 10:31:30 crc kubenswrapper[4792]: I0301 10:31:30.183476 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-utilities/0.log"
Mar 01 10:31:30 crc kubenswrapper[4792]: I0301 10:31:30.200838 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-content/0.log"
Mar 01 10:31:30 crc kubenswrapper[4792]: I0301 10:31:30.237736 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-content/0.log"
Mar 01 10:31:30 crc kubenswrapper[4792]: I0301 10:31:30.382163 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-utilities/0.log"
Mar 01 10:31:30 crc kubenswrapper[4792]: I0301 10:31:30.382671 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-content/0.log"
Mar 01 10:31:31 crc kubenswrapper[4792]: I0301 10:31:31.017840 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/registry-server/0.log"
Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.142260 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539352-dwpdm"]
Mar 01 10:32:00 crc kubenswrapper[4792]: E0301 10:32:00.142975 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f461ce0a-d106-4086-a698-987b95f5f03e" containerName="oc"
Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.142988 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f461ce0a-d106-4086-a698-987b95f5f03e" containerName="oc"
Mar 01 10:32:00 crc kubenswrapper[4792]: E0301 10:32:00.142997 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf24ca39-d196-4cdd-8521-da51a4f51649" containerName="collect-profiles"
Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.143003 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf24ca39-d196-4cdd-8521-da51a4f51649" containerName="collect-profiles"
Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.143187 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f461ce0a-d106-4086-a698-987b95f5f03e" containerName="oc"
Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.143208 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf24ca39-d196-4cdd-8521-da51a4f51649" containerName="collect-profiles"
Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.143803 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539352-dwpdm"
Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.149115 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz"
Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.149217 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.149379 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.176349 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539352-dwpdm"]
Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.242024 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s45ll\" (UniqueName: \"kubernetes.io/projected/a3569407-4c99-405b-801c-6b0378e1643b-kube-api-access-s45ll\") pod \"auto-csr-approver-29539352-dwpdm\" (UID: \"a3569407-4c99-405b-801c-6b0378e1643b\") " pod="openshift-infra/auto-csr-approver-29539352-dwpdm"
Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.343838 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s45ll\" (UniqueName: \"kubernetes.io/projected/a3569407-4c99-405b-801c-6b0378e1643b-kube-api-access-s45ll\") pod \"auto-csr-approver-29539352-dwpdm\" (UID: \"a3569407-4c99-405b-801c-6b0378e1643b\") " pod="openshift-infra/auto-csr-approver-29539352-dwpdm" Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.363685 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s45ll\" (UniqueName: \"kubernetes.io/projected/a3569407-4c99-405b-801c-6b0378e1643b-kube-api-access-s45ll\") pod \"auto-csr-approver-29539352-dwpdm\" (UID: \"a3569407-4c99-405b-801c-6b0378e1643b\") " pod="openshift-infra/auto-csr-approver-29539352-dwpdm" Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.493543 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539352-dwpdm" Mar 01 10:32:01 crc kubenswrapper[4792]: I0301 10:32:01.088218 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539352-dwpdm"] Mar 01 10:32:01 crc kubenswrapper[4792]: I0301 10:32:01.106542 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 10:32:02 crc kubenswrapper[4792]: I0301 10:32:02.022928 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539352-dwpdm" event={"ID":"a3569407-4c99-405b-801c-6b0378e1643b","Type":"ContainerStarted","Data":"34c7bb3f42e4933f2ed76f82d2443b6df65a41268d1ac12d3d1128bf2f7ce4c7"} Mar 01 10:32:03 crc kubenswrapper[4792]: I0301 10:32:03.032369 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539352-dwpdm" 
event={"ID":"a3569407-4c99-405b-801c-6b0378e1643b","Type":"ContainerStarted","Data":"59031c13708156c9066817a64f64803e8bd7d915e063b5c1f1224fd3d37f0722"} Mar 01 10:32:04 crc kubenswrapper[4792]: I0301 10:32:04.041395 4792 generic.go:334] "Generic (PLEG): container finished" podID="a3569407-4c99-405b-801c-6b0378e1643b" containerID="59031c13708156c9066817a64f64803e8bd7d915e063b5c1f1224fd3d37f0722" exitCode=0 Mar 01 10:32:04 crc kubenswrapper[4792]: I0301 10:32:04.041436 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539352-dwpdm" event={"ID":"a3569407-4c99-405b-801c-6b0378e1643b","Type":"ContainerDied","Data":"59031c13708156c9066817a64f64803e8bd7d915e063b5c1f1224fd3d37f0722"} Mar 01 10:32:05 crc kubenswrapper[4792]: I0301 10:32:05.465003 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539352-dwpdm" Mar 01 10:32:05 crc kubenswrapper[4792]: I0301 10:32:05.538113 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s45ll\" (UniqueName: \"kubernetes.io/projected/a3569407-4c99-405b-801c-6b0378e1643b-kube-api-access-s45ll\") pod \"a3569407-4c99-405b-801c-6b0378e1643b\" (UID: \"a3569407-4c99-405b-801c-6b0378e1643b\") " Mar 01 10:32:05 crc kubenswrapper[4792]: I0301 10:32:05.561177 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3569407-4c99-405b-801c-6b0378e1643b-kube-api-access-s45ll" (OuterVolumeSpecName: "kube-api-access-s45ll") pod "a3569407-4c99-405b-801c-6b0378e1643b" (UID: "a3569407-4c99-405b-801c-6b0378e1643b"). InnerVolumeSpecName "kube-api-access-s45ll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:32:05 crc kubenswrapper[4792]: I0301 10:32:05.640356 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s45ll\" (UniqueName: \"kubernetes.io/projected/a3569407-4c99-405b-801c-6b0378e1643b-kube-api-access-s45ll\") on node \"crc\" DevicePath \"\"" Mar 01 10:32:06 crc kubenswrapper[4792]: I0301 10:32:06.064489 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539352-dwpdm" event={"ID":"a3569407-4c99-405b-801c-6b0378e1643b","Type":"ContainerDied","Data":"34c7bb3f42e4933f2ed76f82d2443b6df65a41268d1ac12d3d1128bf2f7ce4c7"} Mar 01 10:32:06 crc kubenswrapper[4792]: I0301 10:32:06.064779 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34c7bb3f42e4933f2ed76f82d2443b6df65a41268d1ac12d3d1128bf2f7ce4c7" Mar 01 10:32:06 crc kubenswrapper[4792]: I0301 10:32:06.064557 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539352-dwpdm" Mar 01 10:32:06 crc kubenswrapper[4792]: I0301 10:32:06.110241 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539346-l8ktk"] Mar 01 10:32:06 crc kubenswrapper[4792]: I0301 10:32:06.123220 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539346-l8ktk"] Mar 01 10:32:07 crc kubenswrapper[4792]: I0301 10:32:07.420306 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c23503b-d97f-4cef-b792-e7fbdd8934ab" path="/var/lib/kubelet/pods/9c23503b-d97f-4cef-b792-e7fbdd8934ab/volumes" Mar 01 10:32:40 crc kubenswrapper[4792]: I0301 10:32:40.613514 4792 scope.go:117] "RemoveContainer" containerID="f2c5b6ef4792c5289a47b67d38a5bcb076b8c29acc57acf629d74f960223e4cf" Mar 01 10:33:04 crc kubenswrapper[4792]: I0301 10:33:04.942630 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:33:04 crc kubenswrapper[4792]: I0301 10:33:04.943463 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:33:34 crc kubenswrapper[4792]: I0301 10:33:34.943455 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:33:34 crc kubenswrapper[4792]: I0301 10:33:34.944015 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:33:40 crc kubenswrapper[4792]: I0301 10:33:40.707028 4792 scope.go:117] "RemoveContainer" containerID="9b1ac8b2a6b9a9734637c133cb7e1fc37defae1380b1bd14a5fb31b1efa6e0e5" Mar 01 10:33:56 crc kubenswrapper[4792]: I0301 10:33:56.203320 4792 generic.go:334] "Generic (PLEG): container finished" podID="f3c98b67-7926-411d-9068-0b7991b0551b" containerID="08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8" exitCode=0 Mar 01 10:33:56 crc kubenswrapper[4792]: I0301 10:33:56.203549 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/must-gather-vdffg" 
event={"ID":"f3c98b67-7926-411d-9068-0b7991b0551b","Type":"ContainerDied","Data":"08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8"} Mar 01 10:33:56 crc kubenswrapper[4792]: I0301 10:33:56.204578 4792 scope.go:117] "RemoveContainer" containerID="08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8" Mar 01 10:33:56 crc kubenswrapper[4792]: I0301 10:33:56.542062 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gqc89_must-gather-vdffg_f3c98b67-7926-411d-9068-0b7991b0551b/gather/0.log" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.146778 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539354-xht7z"] Mar 01 10:34:00 crc kubenswrapper[4792]: E0301 10:34:00.147659 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3569407-4c99-405b-801c-6b0378e1643b" containerName="oc" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.147671 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3569407-4c99-405b-801c-6b0378e1643b" containerName="oc" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.147848 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3569407-4c99-405b-801c-6b0378e1643b" containerName="oc" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.148527 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539354-xht7z" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.150419 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.152750 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.156091 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.168441 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539354-xht7z"] Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.221410 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9ckp\" (UniqueName: \"kubernetes.io/projected/cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6-kube-api-access-b9ckp\") pod \"auto-csr-approver-29539354-xht7z\" (UID: \"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6\") " pod="openshift-infra/auto-csr-approver-29539354-xht7z" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.331268 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9ckp\" (UniqueName: \"kubernetes.io/projected/cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6-kube-api-access-b9ckp\") pod \"auto-csr-approver-29539354-xht7z\" (UID: \"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6\") " pod="openshift-infra/auto-csr-approver-29539354-xht7z" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.576712 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9ckp\" (UniqueName: \"kubernetes.io/projected/cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6-kube-api-access-b9ckp\") pod \"auto-csr-approver-29539354-xht7z\" (UID: \"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6\") " 
pod="openshift-infra/auto-csr-approver-29539354-xht7z" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.767691 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539354-xht7z" Mar 01 10:34:01 crc kubenswrapper[4792]: I0301 10:34:01.253205 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539354-xht7z"] Mar 01 10:34:02 crc kubenswrapper[4792]: I0301 10:34:02.251924 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539354-xht7z" event={"ID":"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6","Type":"ContainerStarted","Data":"7f87b9ff5f5cebd3fa4f8cc04ab1fa49ab1105e996fc5ed9853ba70b1a8e1d9a"} Mar 01 10:34:03 crc kubenswrapper[4792]: I0301 10:34:03.262883 4792 generic.go:334] "Generic (PLEG): container finished" podID="cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6" containerID="18f7a1fbb5db0372beb7cb8583c564353109011b20fbac6efa4b77f8d451a33b" exitCode=0 Mar 01 10:34:03 crc kubenswrapper[4792]: I0301 10:34:03.263088 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539354-xht7z" event={"ID":"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6","Type":"ContainerDied","Data":"18f7a1fbb5db0372beb7cb8583c564353109011b20fbac6efa4b77f8d451a33b"} Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.694368 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539354-xht7z" Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.842986 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9ckp\" (UniqueName: \"kubernetes.io/projected/cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6-kube-api-access-b9ckp\") pod \"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6\" (UID: \"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6\") " Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.849807 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6-kube-api-access-b9ckp" (OuterVolumeSpecName: "kube-api-access-b9ckp") pod "cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6" (UID: "cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6"). InnerVolumeSpecName "kube-api-access-b9ckp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.942589 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.942641 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.942679 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.943413 4792 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b8e6b0aba17b92d064bcafd390c97d2064d3be27e2b966c778b962700333543"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.943464 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://7b8e6b0aba17b92d064bcafd390c97d2064d3be27e2b966c778b962700333543" gracePeriod=600 Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.944728 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9ckp\" (UniqueName: \"kubernetes.io/projected/cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6-kube-api-access-b9ckp\") on node \"crc\" DevicePath \"\"" Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.302007 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539354-xht7z" event={"ID":"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6","Type":"ContainerDied","Data":"7f87b9ff5f5cebd3fa4f8cc04ab1fa49ab1105e996fc5ed9853ba70b1a8e1d9a"} Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.302054 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f87b9ff5f5cebd3fa4f8cc04ab1fa49ab1105e996fc5ed9853ba70b1a8e1d9a" Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.302122 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539354-xht7z" Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.306207 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="7b8e6b0aba17b92d064bcafd390c97d2064d3be27e2b966c778b962700333543" exitCode=0 Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.306247 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"7b8e6b0aba17b92d064bcafd390c97d2064d3be27e2b966c778b962700333543"} Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.306274 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"73191cfccc209b70bde4b4703193d041d53f629db3567c2baa8fb03d0794d6ef"} Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.306289 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.765405 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539348-5hvql"] Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.777240 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539348-5hvql"] Mar 01 10:34:07 crc kubenswrapper[4792]: I0301 10:34:07.422156 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5867157f-b16a-460c-afc4-0981a4d8ee43" path="/var/lib/kubelet/pods/5867157f-b16a-460c-afc4-0981a4d8ee43/volumes" Mar 01 10:34:11 crc kubenswrapper[4792]: I0301 10:34:11.626591 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gqc89/must-gather-vdffg"] Mar 01 10:34:11 crc kubenswrapper[4792]: 
I0301 10:34:11.627326 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-gqc89/must-gather-vdffg" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" containerName="copy" containerID="cri-o://3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560" gracePeriod=2 Mar 01 10:34:11 crc kubenswrapper[4792]: I0301 10:34:11.638925 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gqc89/must-gather-vdffg"] Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.060620 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gqc89_must-gather-vdffg_f3c98b67-7926-411d-9068-0b7991b0551b/copy/0.log" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.060934 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.198919 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcq2x\" (UniqueName: \"kubernetes.io/projected/f3c98b67-7926-411d-9068-0b7991b0551b-kube-api-access-mcq2x\") pod \"f3c98b67-7926-411d-9068-0b7991b0551b\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.199000 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f3c98b67-7926-411d-9068-0b7991b0551b-must-gather-output\") pod \"f3c98b67-7926-411d-9068-0b7991b0551b\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.203914 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c98b67-7926-411d-9068-0b7991b0551b-kube-api-access-mcq2x" (OuterVolumeSpecName: "kube-api-access-mcq2x") pod "f3c98b67-7926-411d-9068-0b7991b0551b" (UID: "f3c98b67-7926-411d-9068-0b7991b0551b"). 
InnerVolumeSpecName "kube-api-access-mcq2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.301031 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcq2x\" (UniqueName: \"kubernetes.io/projected/f3c98b67-7926-411d-9068-0b7991b0551b-kube-api-access-mcq2x\") on node \"crc\" DevicePath \"\"" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.361817 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c98b67-7926-411d-9068-0b7991b0551b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f3c98b67-7926-411d-9068-0b7991b0551b" (UID: "f3c98b67-7926-411d-9068-0b7991b0551b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.375727 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gqc89_must-gather-vdffg_f3c98b67-7926-411d-9068-0b7991b0551b/copy/0.log" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.376021 4792 generic.go:334] "Generic (PLEG): container finished" podID="f3c98b67-7926-411d-9068-0b7991b0551b" containerID="3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560" exitCode=143 Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.376073 4792 scope.go:117] "RemoveContainer" containerID="3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.376198 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.398557 4792 scope.go:117] "RemoveContainer" containerID="08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.402364 4792 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f3c98b67-7926-411d-9068-0b7991b0551b-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.457543 4792 scope.go:117] "RemoveContainer" containerID="3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560" Mar 01 10:34:12 crc kubenswrapper[4792]: E0301 10:34:12.457934 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560\": container with ID starting with 3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560 not found: ID does not exist" containerID="3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.457977 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560"} err="failed to get container status \"3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560\": rpc error: code = NotFound desc = could not find container \"3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560\": container with ID starting with 3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560 not found: ID does not exist" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.458005 4792 scope.go:117] "RemoveContainer" containerID="08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8" Mar 01 10:34:12 crc kubenswrapper[4792]: E0301 
10:34:12.458384 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8\": container with ID starting with 08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8 not found: ID does not exist" containerID="08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.458425 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8"} err="failed to get container status \"08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8\": rpc error: code = NotFound desc = could not find container \"08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8\": container with ID starting with 08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8 not found: ID does not exist" Mar 01 10:34:13 crc kubenswrapper[4792]: I0301 10:34:13.428219 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" path="/var/lib/kubelet/pods/f3c98b67-7926-411d-9068-0b7991b0551b/volumes" Mar 01 10:34:40 crc kubenswrapper[4792]: I0301 10:34:40.921791 4792 scope.go:117] "RemoveContainer" containerID="9237ca7f55124284e9b80295a7be3e3ee8987057df25870f237eb05840050933" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.113202 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q26sh"] Mar 01 10:35:55 crc kubenswrapper[4792]: E0301 10:35:55.118623 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" containerName="gather" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.118884 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" containerName="gather" Mar 01 
10:35:55 crc kubenswrapper[4792]: E0301 10:35:55.119010 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" containerName="copy" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.119094 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" containerName="copy" Mar 01 10:35:55 crc kubenswrapper[4792]: E0301 10:35:55.119189 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6" containerName="oc" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.119268 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6" containerName="oc" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.119571 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" containerName="gather" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.119678 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" containerName="copy" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.119783 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6" containerName="oc" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.121529 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.130571 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q26sh"] Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.147948 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7r8m\" (UniqueName: \"kubernetes.io/projected/c2a39635-b289-4498-8fe7-89dd096cd6b7-kube-api-access-m7r8m\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.148186 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-utilities\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.148231 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-catalog-content\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.249605 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-utilities\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.249666 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-catalog-content\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.249734 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7r8m\" (UniqueName: \"kubernetes.io/projected/c2a39635-b289-4498-8fe7-89dd096cd6b7-kube-api-access-m7r8m\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.250173 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-utilities\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.250293 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-catalog-content\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.271799 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7r8m\" (UniqueName: \"kubernetes.io/projected/c2a39635-b289-4498-8fe7-89dd096cd6b7-kube-api-access-m7r8m\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.447430 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.972719 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q26sh"] Mar 01 10:35:56 crc kubenswrapper[4792]: I0301 10:35:56.390796 4792 generic.go:334] "Generic (PLEG): container finished" podID="c2a39635-b289-4498-8fe7-89dd096cd6b7" containerID="614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659" exitCode=0 Mar 01 10:35:56 crc kubenswrapper[4792]: I0301 10:35:56.390954 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q26sh" event={"ID":"c2a39635-b289-4498-8fe7-89dd096cd6b7","Type":"ContainerDied","Data":"614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659"} Mar 01 10:35:56 crc kubenswrapper[4792]: I0301 10:35:56.391376 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q26sh" event={"ID":"c2a39635-b289-4498-8fe7-89dd096cd6b7","Type":"ContainerStarted","Data":"5edeba1a0067c6ac8c2c982f4caebc08f5998e30ac052774d0d5008a752a8a8f"} Mar 01 10:35:57 crc kubenswrapper[4792]: I0301 10:35:57.405463 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q26sh" event={"ID":"c2a39635-b289-4498-8fe7-89dd096cd6b7","Type":"ContainerStarted","Data":"865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48"} Mar 01 10:35:58 crc kubenswrapper[4792]: I0301 10:35:58.420832 4792 generic.go:334] "Generic (PLEG): container finished" podID="c2a39635-b289-4498-8fe7-89dd096cd6b7" containerID="865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48" exitCode=0 Mar 01 10:35:58 crc kubenswrapper[4792]: I0301 10:35:58.421039 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q26sh" 
event={"ID":"c2a39635-b289-4498-8fe7-89dd096cd6b7","Type":"ContainerDied","Data":"865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48"} Mar 01 10:35:59 crc kubenswrapper[4792]: I0301 10:35:59.433876 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q26sh" event={"ID":"c2a39635-b289-4498-8fe7-89dd096cd6b7","Type":"ContainerStarted","Data":"dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3"} Mar 01 10:35:59 crc kubenswrapper[4792]: I0301 10:35:59.458613 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q26sh" podStartSLOduration=2.04419399 podStartE2EDuration="4.458589849s" podCreationTimestamp="2026-03-01 10:35:55 +0000 UTC" firstStartedPulling="2026-03-01 10:35:56.39302662 +0000 UTC m=+5285.634905817" lastFinishedPulling="2026-03-01 10:35:58.807422489 +0000 UTC m=+5288.049301676" observedRunningTime="2026-03-01 10:35:59.45262213 +0000 UTC m=+5288.694501327" watchObservedRunningTime="2026-03-01 10:35:59.458589849 +0000 UTC m=+5288.700469046" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.167610 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539356-2s5vx"] Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.169713 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.172496 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.175363 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.175507 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.188656 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539356-2s5vx"] Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.279219 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvjcf\" (UniqueName: \"kubernetes.io/projected/a9cf147c-d70d-4595-bc95-e97ed6d5e6e4-kube-api-access-wvjcf\") pod \"auto-csr-approver-29539356-2s5vx\" (UID: \"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4\") " pod="openshift-infra/auto-csr-approver-29539356-2s5vx" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.381670 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvjcf\" (UniqueName: \"kubernetes.io/projected/a9cf147c-d70d-4595-bc95-e97ed6d5e6e4-kube-api-access-wvjcf\") pod \"auto-csr-approver-29539356-2s5vx\" (UID: \"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4\") " pod="openshift-infra/auto-csr-approver-29539356-2s5vx" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.401741 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvjcf\" (UniqueName: \"kubernetes.io/projected/a9cf147c-d70d-4595-bc95-e97ed6d5e6e4-kube-api-access-wvjcf\") pod \"auto-csr-approver-29539356-2s5vx\" (UID: \"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4\") " 
pod="openshift-infra/auto-csr-approver-29539356-2s5vx" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.497075 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.993486 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539356-2s5vx"] Mar 01 10:36:01 crc kubenswrapper[4792]: I0301 10:36:01.458118 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" event={"ID":"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4","Type":"ContainerStarted","Data":"bc1c3b0b371c7e7bdc178d978e2d171b704f2ab3100c1cf4de3a74a7575130a8"} Mar 01 10:36:02 crc kubenswrapper[4792]: I0301 10:36:02.469895 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" event={"ID":"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4","Type":"ContainerStarted","Data":"44e705e0a477a0d8f3cbeb58cb0807914e2a2315e9879ce644f415ed2c08652b"} Mar 01 10:36:02 crc kubenswrapper[4792]: I0301 10:36:02.489639 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" podStartSLOduration=1.552427459 podStartE2EDuration="2.489612314s" podCreationTimestamp="2026-03-01 10:36:00 +0000 UTC" firstStartedPulling="2026-03-01 10:36:01.005711002 +0000 UTC m=+5290.247590199" lastFinishedPulling="2026-03-01 10:36:01.942895857 +0000 UTC m=+5291.184775054" observedRunningTime="2026-03-01 10:36:02.485498221 +0000 UTC m=+5291.727377418" watchObservedRunningTime="2026-03-01 10:36:02.489612314 +0000 UTC m=+5291.731491511" Mar 01 10:36:03 crc kubenswrapper[4792]: I0301 10:36:03.479144 4792 generic.go:334] "Generic (PLEG): container finished" podID="a9cf147c-d70d-4595-bc95-e97ed6d5e6e4" containerID="44e705e0a477a0d8f3cbeb58cb0807914e2a2315e9879ce644f415ed2c08652b" exitCode=0 Mar 01 10:36:03 crc 
kubenswrapper[4792]: I0301 10:36:03.479201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" event={"ID":"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4","Type":"ContainerDied","Data":"44e705e0a477a0d8f3cbeb58cb0807914e2a2315e9879ce644f415ed2c08652b"} Mar 01 10:36:04 crc kubenswrapper[4792]: I0301 10:36:04.847191 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" Mar 01 10:36:04 crc kubenswrapper[4792]: I0301 10:36:04.912759 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvjcf\" (UniqueName: \"kubernetes.io/projected/a9cf147c-d70d-4595-bc95-e97ed6d5e6e4-kube-api-access-wvjcf\") pod \"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4\" (UID: \"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4\") " Mar 01 10:36:04 crc kubenswrapper[4792]: I0301 10:36:04.922887 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9cf147c-d70d-4595-bc95-e97ed6d5e6e4-kube-api-access-wvjcf" (OuterVolumeSpecName: "kube-api-access-wvjcf") pod "a9cf147c-d70d-4595-bc95-e97ed6d5e6e4" (UID: "a9cf147c-d70d-4595-bc95-e97ed6d5e6e4"). InnerVolumeSpecName "kube-api-access-wvjcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.015501 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvjcf\" (UniqueName: \"kubernetes.io/projected/a9cf147c-d70d-4595-bc95-e97ed6d5e6e4-kube-api-access-wvjcf\") on node \"crc\" DevicePath \"\"" Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.448254 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.450197 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.498696 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.499058 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" event={"ID":"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4","Type":"ContainerDied","Data":"bc1c3b0b371c7e7bdc178d978e2d171b704f2ab3100c1cf4de3a74a7575130a8"} Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.499172 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc1c3b0b371c7e7bdc178d978e2d171b704f2ab3100c1cf4de3a74a7575130a8" Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.537071 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.581604 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539350-vrwj8"] Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.597552 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539350-vrwj8"] Mar 01 
10:36:06 crc kubenswrapper[4792]: I0301 10:36:06.580026 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:36:06 crc kubenswrapper[4792]: I0301 10:36:06.640327 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q26sh"] Mar 01 10:36:07 crc kubenswrapper[4792]: I0301 10:36:07.419184 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f461ce0a-d106-4086-a698-987b95f5f03e" path="/var/lib/kubelet/pods/f461ce0a-d106-4086-a698-987b95f5f03e/volumes" Mar 01 10:36:08 crc kubenswrapper[4792]: I0301 10:36:08.539076 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q26sh" podUID="c2a39635-b289-4498-8fe7-89dd096cd6b7" containerName="registry-server" containerID="cri-o://dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3" gracePeriod=2 Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.076347 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.205777 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-utilities\") pod \"c2a39635-b289-4498-8fe7-89dd096cd6b7\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.205896 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7r8m\" (UniqueName: \"kubernetes.io/projected/c2a39635-b289-4498-8fe7-89dd096cd6b7-kube-api-access-m7r8m\") pod \"c2a39635-b289-4498-8fe7-89dd096cd6b7\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.206005 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-catalog-content\") pod \"c2a39635-b289-4498-8fe7-89dd096cd6b7\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.206640 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-utilities" (OuterVolumeSpecName: "utilities") pod "c2a39635-b289-4498-8fe7-89dd096cd6b7" (UID: "c2a39635-b289-4498-8fe7-89dd096cd6b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.211083 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a39635-b289-4498-8fe7-89dd096cd6b7-kube-api-access-m7r8m" (OuterVolumeSpecName: "kube-api-access-m7r8m") pod "c2a39635-b289-4498-8fe7-89dd096cd6b7" (UID: "c2a39635-b289-4498-8fe7-89dd096cd6b7"). InnerVolumeSpecName "kube-api-access-m7r8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.231877 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2a39635-b289-4498-8fe7-89dd096cd6b7" (UID: "c2a39635-b289-4498-8fe7-89dd096cd6b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.307891 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.308124 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7r8m\" (UniqueName: \"kubernetes.io/projected/c2a39635-b289-4498-8fe7-89dd096cd6b7-kube-api-access-m7r8m\") on node \"crc\" DevicePath \"\"" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.308219 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.551002 4792 generic.go:334] "Generic (PLEG): container finished" podID="c2a39635-b289-4498-8fe7-89dd096cd6b7" containerID="dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3" exitCode=0 Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.551051 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q26sh" event={"ID":"c2a39635-b289-4498-8fe7-89dd096cd6b7","Type":"ContainerDied","Data":"dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3"} Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.551084 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-q26sh" event={"ID":"c2a39635-b289-4498-8fe7-89dd096cd6b7","Type":"ContainerDied","Data":"5edeba1a0067c6ac8c2c982f4caebc08f5998e30ac052774d0d5008a752a8a8f"} Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.551105 4792 scope.go:117] "RemoveContainer" containerID="dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.551057 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.582977 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q26sh"] Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.587379 4792 scope.go:117] "RemoveContainer" containerID="865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.597884 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q26sh"] Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.608817 4792 scope.go:117] "RemoveContainer" containerID="614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.654764 4792 scope.go:117] "RemoveContainer" containerID="dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3" Mar 01 10:36:09 crc kubenswrapper[4792]: E0301 10:36:09.655211 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3\": container with ID starting with dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3 not found: ID does not exist" containerID="dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.655242 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3"} err="failed to get container status \"dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3\": rpc error: code = NotFound desc = could not find container \"dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3\": container with ID starting with dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3 not found: ID does not exist" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.655264 4792 scope.go:117] "RemoveContainer" containerID="865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48" Mar 01 10:36:09 crc kubenswrapper[4792]: E0301 10:36:09.655983 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48\": container with ID starting with 865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48 not found: ID does not exist" containerID="865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.656004 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48"} err="failed to get container status \"865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48\": rpc error: code = NotFound desc = could not find container \"865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48\": container with ID starting with 865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48 not found: ID does not exist" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.656018 4792 scope.go:117] "RemoveContainer" containerID="614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659" Mar 01 10:36:09 crc kubenswrapper[4792]: E0301 
10:36:09.656400 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659\": container with ID starting with 614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659 not found: ID does not exist" containerID="614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.656585 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659"} err="failed to get container status \"614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659\": rpc error: code = NotFound desc = could not find container \"614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659\": container with ID starting with 614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659 not found: ID does not exist" Mar 01 10:36:11 crc kubenswrapper[4792]: I0301 10:36:11.427815 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a39635-b289-4498-8fe7-89dd096cd6b7" path="/var/lib/kubelet/pods/c2a39635-b289-4498-8fe7-89dd096cd6b7/volumes" Mar 01 10:36:34 crc kubenswrapper[4792]: I0301 10:36:34.951526 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:36:34 crc kubenswrapper[4792]: I0301 10:36:34.952148 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 01 10:36:41 crc kubenswrapper[4792]: I0301 10:36:41.029702 4792 scope.go:117] "RemoveContainer" containerID="72dbc4f60f34ad4150df8188c6c0da347f12c8f2e84aa4c765ae93732fb406a9" Mar 01 10:37:04 crc kubenswrapper[4792]: I0301 10:37:04.942956 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:37:04 crc kubenswrapper[4792]: I0301 10:37:04.943793 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:37:34 crc kubenswrapper[4792]: I0301 10:37:34.943705 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:37:34 crc kubenswrapper[4792]: I0301 10:37:34.944265 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:37:34 crc kubenswrapper[4792]: I0301 10:37:34.944532 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 10:37:34 crc kubenswrapper[4792]: I0301 10:37:34.945239 4792 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"73191cfccc209b70bde4b4703193d041d53f629db3567c2baa8fb03d0794d6ef"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 10:37:34 crc kubenswrapper[4792]: I0301 10:37:34.945297 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://73191cfccc209b70bde4b4703193d041d53f629db3567c2baa8fb03d0794d6ef" gracePeriod=600 Mar 01 10:37:35 crc kubenswrapper[4792]: E0301 10:37:35.064680 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:37:35 crc kubenswrapper[4792]: I0301 10:37:35.798363 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="73191cfccc209b70bde4b4703193d041d53f629db3567c2baa8fb03d0794d6ef" exitCode=0 Mar 01 10:37:35 crc kubenswrapper[4792]: I0301 10:37:35.798408 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"73191cfccc209b70bde4b4703193d041d53f629db3567c2baa8fb03d0794d6ef"} Mar 01 10:37:35 crc kubenswrapper[4792]: I0301 10:37:35.798444 4792 scope.go:117] "RemoveContainer" containerID="7b8e6b0aba17b92d064bcafd390c97d2064d3be27e2b966c778b962700333543" Mar 01 10:37:35 crc 
kubenswrapper[4792]: I0301 10:37:35.799149 4792 scope.go:117] "RemoveContainer" containerID="73191cfccc209b70bde4b4703193d041d53f629db3567c2baa8fb03d0794d6ef" Mar 01 10:37:35 crc kubenswrapper[4792]: E0301 10:37:35.799454 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:37:46 crc kubenswrapper[4792]: I0301 10:37:46.410032 4792 scope.go:117] "RemoveContainer" containerID="73191cfccc209b70bde4b4703193d041d53f629db3567c2baa8fb03d0794d6ef" Mar 01 10:37:46 crc kubenswrapper[4792]: E0301 10:37:46.410807 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.146992 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539358-mdj6g"] Mar 01 10:38:00 crc kubenswrapper[4792]: E0301 10:38:00.148394 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a39635-b289-4498-8fe7-89dd096cd6b7" containerName="extract-utilities" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.148411 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a39635-b289-4498-8fe7-89dd096cd6b7" containerName="extract-utilities" Mar 01 10:38:00 crc kubenswrapper[4792]: E0301 10:38:00.148431 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c2a39635-b289-4498-8fe7-89dd096cd6b7" containerName="extract-content" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.148440 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a39635-b289-4498-8fe7-89dd096cd6b7" containerName="extract-content" Mar 01 10:38:00 crc kubenswrapper[4792]: E0301 10:38:00.148457 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a39635-b289-4498-8fe7-89dd096cd6b7" containerName="registry-server" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.148464 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a39635-b289-4498-8fe7-89dd096cd6b7" containerName="registry-server" Mar 01 10:38:00 crc kubenswrapper[4792]: E0301 10:38:00.148485 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9cf147c-d70d-4595-bc95-e97ed6d5e6e4" containerName="oc" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.148492 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9cf147c-d70d-4595-bc95-e97ed6d5e6e4" containerName="oc" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.148714 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a39635-b289-4498-8fe7-89dd096cd6b7" containerName="registry-server" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.148728 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9cf147c-d70d-4595-bc95-e97ed6d5e6e4" containerName="oc" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.156174 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539358-mdj6g" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.157220 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539358-mdj6g"] Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.159966 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.160810 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.161935 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.164563 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkb8z\" (UniqueName: \"kubernetes.io/projected/93b961d8-9849-4adc-b4cb-2425eb1b96ce-kube-api-access-qkb8z\") pod \"auto-csr-approver-29539358-mdj6g\" (UID: \"93b961d8-9849-4adc-b4cb-2425eb1b96ce\") " pod="openshift-infra/auto-csr-approver-29539358-mdj6g" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.266216 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkb8z\" (UniqueName: \"kubernetes.io/projected/93b961d8-9849-4adc-b4cb-2425eb1b96ce-kube-api-access-qkb8z\") pod \"auto-csr-approver-29539358-mdj6g\" (UID: \"93b961d8-9849-4adc-b4cb-2425eb1b96ce\") " pod="openshift-infra/auto-csr-approver-29539358-mdj6g" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.290701 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkb8z\" (UniqueName: \"kubernetes.io/projected/93b961d8-9849-4adc-b4cb-2425eb1b96ce-kube-api-access-qkb8z\") pod \"auto-csr-approver-29539358-mdj6g\" (UID: \"93b961d8-9849-4adc-b4cb-2425eb1b96ce\") " 
pod="openshift-infra/auto-csr-approver-29539358-mdj6g" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.409062 4792 scope.go:117] "RemoveContainer" containerID="73191cfccc209b70bde4b4703193d041d53f629db3567c2baa8fb03d0794d6ef" Mar 01 10:38:00 crc kubenswrapper[4792]: E0301 10:38:00.410054 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.475635 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539358-mdj6g" Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.779735 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539358-mdj6g"] Mar 01 10:38:00 crc kubenswrapper[4792]: I0301 10:38:00.791168 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 10:38:01 crc kubenswrapper[4792]: I0301 10:38:01.082780 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539358-mdj6g" event={"ID":"93b961d8-9849-4adc-b4cb-2425eb1b96ce","Type":"ContainerStarted","Data":"a070729f96faea9c216f7e2f740116a14a064970fb98cf422fa862c7616fa3e8"} Mar 01 10:38:02 crc kubenswrapper[4792]: I0301 10:38:02.090798 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539358-mdj6g" event={"ID":"93b961d8-9849-4adc-b4cb-2425eb1b96ce","Type":"ContainerStarted","Data":"43e979e3eaf2ad491d95cbfa686cf4ef7bcc0455205bebe27aa4d22689c1e591"} Mar 01 10:38:02 crc kubenswrapper[4792]: I0301 10:38:02.117923 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539358-mdj6g" podStartSLOduration=1.326193235 podStartE2EDuration="2.117884691s" podCreationTimestamp="2026-03-01 10:38:00 +0000 UTC" firstStartedPulling="2026-03-01 10:38:00.790862254 +0000 UTC m=+5410.032741471" lastFinishedPulling="2026-03-01 10:38:01.58255371 +0000 UTC m=+5410.824432927" observedRunningTime="2026-03-01 10:38:02.106750862 +0000 UTC m=+5411.348630069" watchObservedRunningTime="2026-03-01 10:38:02.117884691 +0000 UTC m=+5411.359763888" Mar 01 10:38:03 crc kubenswrapper[4792]: I0301 10:38:03.101541 4792 generic.go:334] "Generic (PLEG): container finished" podID="93b961d8-9849-4adc-b4cb-2425eb1b96ce" containerID="43e979e3eaf2ad491d95cbfa686cf4ef7bcc0455205bebe27aa4d22689c1e591" exitCode=0 Mar 01 10:38:03 crc kubenswrapper[4792]: I0301 10:38:03.101583 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539358-mdj6g" event={"ID":"93b961d8-9849-4adc-b4cb-2425eb1b96ce","Type":"ContainerDied","Data":"43e979e3eaf2ad491d95cbfa686cf4ef7bcc0455205bebe27aa4d22689c1e591"} Mar 01 10:38:04 crc kubenswrapper[4792]: I0301 10:38:04.515898 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539358-mdj6g" Mar 01 10:38:04 crc kubenswrapper[4792]: I0301 10:38:04.576541 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkb8z\" (UniqueName: \"kubernetes.io/projected/93b961d8-9849-4adc-b4cb-2425eb1b96ce-kube-api-access-qkb8z\") pod \"93b961d8-9849-4adc-b4cb-2425eb1b96ce\" (UID: \"93b961d8-9849-4adc-b4cb-2425eb1b96ce\") " Mar 01 10:38:04 crc kubenswrapper[4792]: I0301 10:38:04.583487 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93b961d8-9849-4adc-b4cb-2425eb1b96ce-kube-api-access-qkb8z" (OuterVolumeSpecName: "kube-api-access-qkb8z") pod "93b961d8-9849-4adc-b4cb-2425eb1b96ce" (UID: "93b961d8-9849-4adc-b4cb-2425eb1b96ce"). InnerVolumeSpecName "kube-api-access-qkb8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:38:04 crc kubenswrapper[4792]: I0301 10:38:04.679679 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkb8z\" (UniqueName: \"kubernetes.io/projected/93b961d8-9849-4adc-b4cb-2425eb1b96ce-kube-api-access-qkb8z\") on node \"crc\" DevicePath \"\"" Mar 01 10:38:05 crc kubenswrapper[4792]: I0301 10:38:05.123935 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539358-mdj6g" event={"ID":"93b961d8-9849-4adc-b4cb-2425eb1b96ce","Type":"ContainerDied","Data":"a070729f96faea9c216f7e2f740116a14a064970fb98cf422fa862c7616fa3e8"} Mar 01 10:38:05 crc kubenswrapper[4792]: I0301 10:38:05.123970 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a070729f96faea9c216f7e2f740116a14a064970fb98cf422fa862c7616fa3e8" Mar 01 10:38:05 crc kubenswrapper[4792]: I0301 10:38:05.123977 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539358-mdj6g" Mar 01 10:38:05 crc kubenswrapper[4792]: I0301 10:38:05.604554 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539352-dwpdm"] Mar 01 10:38:05 crc kubenswrapper[4792]: I0301 10:38:05.619551 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539352-dwpdm"] Mar 01 10:38:07 crc kubenswrapper[4792]: I0301 10:38:07.420824 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3569407-4c99-405b-801c-6b0378e1643b" path="/var/lib/kubelet/pods/a3569407-4c99-405b-801c-6b0378e1643b/volumes" Mar 01 10:38:15 crc kubenswrapper[4792]: I0301 10:38:15.408804 4792 scope.go:117] "RemoveContainer" containerID="73191cfccc209b70bde4b4703193d041d53f629db3567c2baa8fb03d0794d6ef" Mar 01 10:38:15 crc kubenswrapper[4792]: E0301 10:38:15.411236 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:38:26 crc kubenswrapper[4792]: I0301 10:38:26.409668 4792 scope.go:117] "RemoveContainer" containerID="73191cfccc209b70bde4b4703193d041d53f629db3567c2baa8fb03d0794d6ef" Mar 01 10:38:26 crc kubenswrapper[4792]: E0301 10:38:26.410350 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:38:26 crc kubenswrapper[4792]: I0301 10:38:26.464276 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fhpp9"] Mar 01 10:38:26 crc kubenswrapper[4792]: E0301 10:38:26.464677 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b961d8-9849-4adc-b4cb-2425eb1b96ce" containerName="oc" Mar 01 10:38:26 crc kubenswrapper[4792]: I0301 10:38:26.464693 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b961d8-9849-4adc-b4cb-2425eb1b96ce" containerName="oc" Mar 01 10:38:26 crc kubenswrapper[4792]: I0301 10:38:26.464934 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b961d8-9849-4adc-b4cb-2425eb1b96ce" containerName="oc" Mar 01 10:38:26 crc kubenswrapper[4792]: I0301 10:38:26.466182 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhpp9" Mar 01 10:38:26 crc kubenswrapper[4792]: I0301 10:38:26.504483 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhpp9"] Mar 01 10:38:26 crc kubenswrapper[4792]: I0301 10:38:26.562582 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd69486-3302-43f5-9d4a-39bdfca63877-catalog-content\") pod \"community-operators-fhpp9\" (UID: \"2cd69486-3302-43f5-9d4a-39bdfca63877\") " pod="openshift-marketplace/community-operators-fhpp9" Mar 01 10:38:26 crc kubenswrapper[4792]: I0301 10:38:26.562649 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2stn9\" (UniqueName: \"kubernetes.io/projected/2cd69486-3302-43f5-9d4a-39bdfca63877-kube-api-access-2stn9\") pod \"community-operators-fhpp9\" (UID: \"2cd69486-3302-43f5-9d4a-39bdfca63877\") " pod="openshift-marketplace/community-operators-fhpp9" Mar 01 
10:38:26 crc kubenswrapper[4792]: I0301 10:38:26.562705 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd69486-3302-43f5-9d4a-39bdfca63877-utilities\") pod \"community-operators-fhpp9\" (UID: \"2cd69486-3302-43f5-9d4a-39bdfca63877\") " pod="openshift-marketplace/community-operators-fhpp9" Mar 01 10:38:26 crc kubenswrapper[4792]: I0301 10:38:26.664468 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd69486-3302-43f5-9d4a-39bdfca63877-catalog-content\") pod \"community-operators-fhpp9\" (UID: \"2cd69486-3302-43f5-9d4a-39bdfca63877\") " pod="openshift-marketplace/community-operators-fhpp9" Mar 01 10:38:26 crc kubenswrapper[4792]: I0301 10:38:26.664526 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2stn9\" (UniqueName: \"kubernetes.io/projected/2cd69486-3302-43f5-9d4a-39bdfca63877-kube-api-access-2stn9\") pod \"community-operators-fhpp9\" (UID: \"2cd69486-3302-43f5-9d4a-39bdfca63877\") " pod="openshift-marketplace/community-operators-fhpp9" Mar 01 10:38:26 crc kubenswrapper[4792]: I0301 10:38:26.664597 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd69486-3302-43f5-9d4a-39bdfca63877-utilities\") pod \"community-operators-fhpp9\" (UID: \"2cd69486-3302-43f5-9d4a-39bdfca63877\") " pod="openshift-marketplace/community-operators-fhpp9" Mar 01 10:38:26 crc kubenswrapper[4792]: I0301 10:38:26.665155 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd69486-3302-43f5-9d4a-39bdfca63877-catalog-content\") pod \"community-operators-fhpp9\" (UID: \"2cd69486-3302-43f5-9d4a-39bdfca63877\") " pod="openshift-marketplace/community-operators-fhpp9" Mar 01 10:38:26 crc 
kubenswrapper[4792]: I0301 10:38:26.665192 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd69486-3302-43f5-9d4a-39bdfca63877-utilities\") pod \"community-operators-fhpp9\" (UID: \"2cd69486-3302-43f5-9d4a-39bdfca63877\") " pod="openshift-marketplace/community-operators-fhpp9" Mar 01 10:38:26 crc kubenswrapper[4792]: I0301 10:38:26.683602 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2stn9\" (UniqueName: \"kubernetes.io/projected/2cd69486-3302-43f5-9d4a-39bdfca63877-kube-api-access-2stn9\") pod \"community-operators-fhpp9\" (UID: \"2cd69486-3302-43f5-9d4a-39bdfca63877\") " pod="openshift-marketplace/community-operators-fhpp9" Mar 01 10:38:26 crc kubenswrapper[4792]: I0301 10:38:26.788959 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhpp9" Mar 01 10:38:27 crc kubenswrapper[4792]: I0301 10:38:27.355469 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhpp9"] Mar 01 10:38:28 crc kubenswrapper[4792]: I0301 10:38:28.318832 4792 generic.go:334] "Generic (PLEG): container finished" podID="2cd69486-3302-43f5-9d4a-39bdfca63877" containerID="a8f21807225e3a6ace4c6002bda4fdd20872ce050c558297e617d70bab489f52" exitCode=0 Mar 01 10:38:28 crc kubenswrapper[4792]: I0301 10:38:28.319884 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhpp9" event={"ID":"2cd69486-3302-43f5-9d4a-39bdfca63877","Type":"ContainerDied","Data":"a8f21807225e3a6ace4c6002bda4fdd20872ce050c558297e617d70bab489f52"} Mar 01 10:38:28 crc kubenswrapper[4792]: I0301 10:38:28.320041 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhpp9" 
event={"ID":"2cd69486-3302-43f5-9d4a-39bdfca63877","Type":"ContainerStarted","Data":"4ea7cfaef119086071ad5bfbe19f079a614e7d099a6039996c82016b9fcdeb1a"} Mar 01 10:38:29 crc kubenswrapper[4792]: I0301 10:38:29.330668 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhpp9" event={"ID":"2cd69486-3302-43f5-9d4a-39bdfca63877","Type":"ContainerStarted","Data":"ef3cde72a14f8edae0053d1910b9b1bf9e136f1998ac725622b1087bb8fb26db"} Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.056124 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vn8wq"] Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.058490 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vn8wq" Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.082241 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vn8wq"] Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.147985 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71-utilities\") pod \"redhat-operators-vn8wq\" (UID: \"e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71\") " pod="openshift-marketplace/redhat-operators-vn8wq" Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.148288 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71-catalog-content\") pod \"redhat-operators-vn8wq\" (UID: \"e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71\") " pod="openshift-marketplace/redhat-operators-vn8wq" Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.148460 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsg4l\" 
(UniqueName: \"kubernetes.io/projected/e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71-kube-api-access-wsg4l\") pod \"redhat-operators-vn8wq\" (UID: \"e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71\") " pod="openshift-marketplace/redhat-operators-vn8wq" Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.250689 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsg4l\" (UniqueName: \"kubernetes.io/projected/e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71-kube-api-access-wsg4l\") pod \"redhat-operators-vn8wq\" (UID: \"e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71\") " pod="openshift-marketplace/redhat-operators-vn8wq" Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.250815 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71-utilities\") pod \"redhat-operators-vn8wq\" (UID: \"e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71\") " pod="openshift-marketplace/redhat-operators-vn8wq" Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.250950 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71-catalog-content\") pod \"redhat-operators-vn8wq\" (UID: \"e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71\") " pod="openshift-marketplace/redhat-operators-vn8wq" Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.251595 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71-utilities\") pod \"redhat-operators-vn8wq\" (UID: \"e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71\") " pod="openshift-marketplace/redhat-operators-vn8wq" Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.251736 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71-catalog-content\") pod \"redhat-operators-vn8wq\" (UID: \"e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71\") " pod="openshift-marketplace/redhat-operators-vn8wq" Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.275556 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsg4l\" (UniqueName: \"kubernetes.io/projected/e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71-kube-api-access-wsg4l\") pod \"redhat-operators-vn8wq\" (UID: \"e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71\") " pod="openshift-marketplace/redhat-operators-vn8wq" Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.341362 4792 generic.go:334] "Generic (PLEG): container finished" podID="2cd69486-3302-43f5-9d4a-39bdfca63877" containerID="ef3cde72a14f8edae0053d1910b9b1bf9e136f1998ac725622b1087bb8fb26db" exitCode=0 Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.341411 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhpp9" event={"ID":"2cd69486-3302-43f5-9d4a-39bdfca63877","Type":"ContainerDied","Data":"ef3cde72a14f8edae0053d1910b9b1bf9e136f1998ac725622b1087bb8fb26db"} Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.380710 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vn8wq" Mar 01 10:38:30 crc kubenswrapper[4792]: I0301 10:38:30.939579 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vn8wq"] Mar 01 10:38:31 crc kubenswrapper[4792]: I0301 10:38:31.362739 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhpp9" event={"ID":"2cd69486-3302-43f5-9d4a-39bdfca63877","Type":"ContainerStarted","Data":"837b7041bb8a5791d0127c1c3cb81c752d36344eee9bb73c0eb8909af5087158"} Mar 01 10:38:31 crc kubenswrapper[4792]: I0301 10:38:31.369277 4792 generic.go:334] "Generic (PLEG): container finished" podID="e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71" containerID="af3b88479908d8281d69c63cd42aba38c75d2bec13893233bab68d039d7309b8" exitCode=0 Mar 01 10:38:31 crc kubenswrapper[4792]: I0301 10:38:31.369339 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vn8wq" event={"ID":"e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71","Type":"ContainerDied","Data":"af3b88479908d8281d69c63cd42aba38c75d2bec13893233bab68d039d7309b8"} Mar 01 10:38:31 crc kubenswrapper[4792]: I0301 10:38:31.369365 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vn8wq" event={"ID":"e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71","Type":"ContainerStarted","Data":"499017856f3064abf3990efcb275a5bd7a272ec526e4c1a7c48f4798e24dc43e"} Mar 01 10:38:31 crc kubenswrapper[4792]: I0301 10:38:31.422322 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fhpp9" podStartSLOduration=3.031760356 podStartE2EDuration="5.422304796s" podCreationTimestamp="2026-03-01 10:38:26 +0000 UTC" firstStartedPulling="2026-03-01 10:38:28.32268772 +0000 UTC m=+5437.564566917" lastFinishedPulling="2026-03-01 10:38:30.71323216 +0000 UTC m=+5439.955111357" observedRunningTime="2026-03-01 10:38:31.402531006 +0000 UTC 
m=+5440.644410203" watchObservedRunningTime="2026-03-01 10:38:31.422304796 +0000 UTC m=+5440.664183993" Mar 01 10:38:32 crc kubenswrapper[4792]: I0301 10:38:32.379064 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vn8wq" event={"ID":"e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71","Type":"ContainerStarted","Data":"12e1c3fffa7722bb12c081f8cd190eb38a21faf473a8b5a6e9303b13cfbf398c"} Mar 01 10:38:36 crc kubenswrapper[4792]: I0301 10:38:36.789366 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fhpp9" Mar 01 10:38:36 crc kubenswrapper[4792]: I0301 10:38:36.789995 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fhpp9" Mar 01 10:38:36 crc kubenswrapper[4792]: I0301 10:38:36.834577 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fhpp9" Mar 01 10:38:37 crc kubenswrapper[4792]: I0301 10:38:37.409649 4792 scope.go:117] "RemoveContainer" containerID="73191cfccc209b70bde4b4703193d041d53f629db3567c2baa8fb03d0794d6ef" Mar 01 10:38:37 crc kubenswrapper[4792]: E0301 10:38:37.410324 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:38:37 crc kubenswrapper[4792]: I0301 10:38:37.427568 4792 generic.go:334] "Generic (PLEG): container finished" podID="e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71" containerID="12e1c3fffa7722bb12c081f8cd190eb38a21faf473a8b5a6e9303b13cfbf398c" exitCode=0 Mar 01 10:38:37 crc kubenswrapper[4792]: I0301 10:38:37.428801 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vn8wq" event={"ID":"e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71","Type":"ContainerDied","Data":"12e1c3fffa7722bb12c081f8cd190eb38a21faf473a8b5a6e9303b13cfbf398c"} Mar 01 10:38:37 crc kubenswrapper[4792]: I0301 10:38:37.521132 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fhpp9" Mar 01 10:38:38 crc kubenswrapper[4792]: I0301 10:38:38.441356 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vn8wq" event={"ID":"e1fa6d03-9021-4bf7-bb72-9b4ff5f63a71","Type":"ContainerStarted","Data":"41b19061e8be656c49c0989d5fdb0315202b6dd922c27b7da32cfe153c01210f"} Mar 01 10:38:38 crc kubenswrapper[4792]: I0301 10:38:38.468295 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vn8wq" podStartSLOduration=2.027491446 podStartE2EDuration="8.468268008s" podCreationTimestamp="2026-03-01 10:38:30 +0000 UTC" firstStartedPulling="2026-03-01 10:38:31.370491241 +0000 UTC m=+5440.612370428" lastFinishedPulling="2026-03-01 10:38:37.811267793 +0000 UTC m=+5447.053146990" observedRunningTime="2026-03-01 10:38:38.459318626 +0000 UTC m=+5447.701197843" watchObservedRunningTime="2026-03-01 10:38:38.468268008 +0000 UTC m=+5447.710147225" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515151013472024444 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015151013473017362 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015151000325016475 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015151000326015446 5ustar corecore